
Unit 4

IMPLEMENTATION SUPPORT

Implementation support

● Programming Tools- levels of services for programmers

● Windowing Systems- core support for separate and simultaneous user-system activity.

● Programming the application and control of dialogue

● Interaction toolkits- bring programming closer to level of user perception

● User interface management systems- control the relationship between presentation and application functionality.

How does HCI affect the programmer?

Advances in coding have elevated programming to a higher level of abstraction:

- from hardware specific

- to interaction-technique specific

Layers of development tools

● windowing systems

● interaction toolkits

● user interface management systems

Device independence

- programming the abstract terminal: device drivers and image models handle output and (partially) input (a sketch follows the list below)

● Pixels

● PostScript (Mac OS X, NeXTSTEP)

● Graphical Kernel System (GKS)

● Programmers' Hierarchical Interface to Graphics (PHIGS)
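
To make the image-model idea concrete, here is a minimal sketch using the Canvas widget from Python's standard Tkinter toolkit as a stand-in for an abstract drawing surface; the shapes and coordinates are illustrative assumptions. The program describes lines, ovals and text as device-independent drawing operations and leaves the mapping to actual pixels to the underlying window system and device driver.

import tkinter as tk

# Sketch only: the application draws through an abstract image model
# (Canvas operations) rather than addressing pixels directly.
root = tk.Tk()
canvas = tk.Canvas(root, width=200, height=120)
canvas.pack()

canvas.create_line(10, 10, 190, 10)                    # a line, not pixels
canvas.create_oval(60, 30, 140, 90, outline="black")   # an oval outline
canvas.create_text(100, 105, text="abstract drawing surface")

root.mainloop()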

Architectures of Windowing Systems

Three possible software architectures:

1. Each application manages all processes

- everyone worries about synchronization

- reduces portability of applications

2. Management role within kernel of operating system

- applications tied to operating system

3. Management role as separate application

- maximum portability

System Style Affects The Interfaces

● Modal Dialogue Box

- easy with event-loop (just have extra read-event loop)

- hard with notification (need lots of mode flags)

● Non-modal Dialogue Box

- hard with event-loop (very complicated main loop)

- easy with notification (just add an extra handler; see the sketch below)
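
A minimal, self-contained sketch of the contrast above, written in Python with an invented event queue and invented event names (no real window system is involved): the modal case simply runs an extra, nested read-event loop, while the notification case simply registers an extra handler with a dispatcher.

from collections import deque

# Pretend event stream delivered by the window system (illustrative only).
events = deque(["MOVE", "CLICK", "OK", "CLICK", "OK"])

def get_event():
    # Stand-in for the window system's blocking "read next event" call.
    return events.popleft() if events else "QUIT"

# Event-loop style: a modal dialogue box just runs an extra read-event loop
# until it is dismissed; events meant for other windows are ignored meanwhile.
def run_modal_dialogue():
    while True:
        event = get_event()
        if event in ("OK", "CANCEL", "QUIT"):
            return event          # nested loop ends; the main loop resumes

# Notification style: a non-modal dialogue box just registers an extra handler
# and the dispatcher calls it whenever a matching event arrives.
handlers = {"OK": lambda e: print("dialogue confirmed:", e)}

def notification_loop():
    while events:
        event = get_event()
        handlers.get(event, lambda e: None)(event)

print("modal result:", run_modal_dialogue())   # consumes events up to the first "OK"
notification_loop()                            # dispatches the remaining events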

Toolkits

To aid the programmer in fusing input and output behaviors, another level of abstraction is placed on
top of the window system – the toolkit. A toolkit provides the programmer with a set of ready-made
interaction objects – alternatively called interaction techniques, gadgets or widgets – which she can use
to create her application programs.
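
As a small illustration, the sketch below uses Python's standard Tkinter toolkit (chosen here only as a familiar example of such a toolkit): the Button widget is a ready-made interaction object whose input behavior (clicking) and output behavior (drawing itself, giving visual feedback) are already fused, so the programmer only supplies the label text and the callback.

import tkinter as tk

root = tk.Tk()
root.title("Toolkit example")

# A ready-made interaction object: the toolkit handles drawing, highlighting
# and click detection; the application only provides the response.
button = tk.Button(root, text="Press me",
                   command=lambda: print("button pressed"))
button.pack(padx=20, pady=20)

root.mainloop()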

Evaluation techniques in Human-Computer Interaction (HCI) assess the usability, effectiveness, and user
experience of interactive systems. Here are some common techniques:

User-Centered Evaluation Techniques

1. Usability Testing: Observing users interacting with a system to identify usability issues.

2. User Interviews: Gathering user feedback and opinions through structured or semi-structured
interviews.

3. Surveys and Questionnaires: Collecting user feedback through online or offline surveys.

4. Think-Aloud Protocol: Asking users to verbalize their thoughts while interacting with a system.

Expert-Based Evaluation Techniques

1. Heuristic Evaluation: Evaluating a system against established usability principles (heuristics).

2. Cognitive Walkthrough: Analyzing a system's usability by walking through a task scenario.

3. Expert Review: Having an expert evaluate a system's usability and user experience.

Quantitative Evaluation Techniques

1. Performance Metrics: Measuring user performance (e.g., task completion time, error rate).

2. Eye-Tracking: Analyzing user gaze and attention.

3. Log File Analysis: Analyzing user behavior through system logs.
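
As a hedged illustration of how performance metrics can be computed through log file analysis, the short sketch below derives task completion time and error rate from a small, invented set of usability-test log records; the record format and the numbers are assumptions made purely for illustration.

# Each record: (task_id, start_seconds, end_seconds, error_count) -- invented format.
log = [
    ("book_flight", 0.0, 95.0, 2),
    ("book_flight", 0.0, 61.0, 0),
    ("book_flight", 0.0, 120.0, 4),
]

completion_times = [end - start for _, start, end, _ in log]
error_counts = [errors for *_, errors in log]

print("mean task completion time (s):", sum(completion_times) / len(completion_times))
print("mean errors per task:", sum(error_counts) / len(error_counts))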

Qualitative Evaluation Techniques

1. Contextual Inquiry: Observing users in their natural environment.

2. Focus Groups: Gathering user feedback through group discussions.

3. Content Analysis: Analyzing user-generated content (e.g., feedback, reviews).

These techniques help HCI professionals identify usability issues, improve user experience, and ensure
that interactive systems meet user needs.

The goals of Human-Computer Interaction (HCI) include:

1. Enhance User Experience: Create intuitive, engaging, and satisfying interactions.

2. Improve Usability: Ensure systems are easy to learn, use, and remember.

3. Increase Accessibility: Make technology accessible to diverse users, including those with disabilities.

4. Boost Productivity: Design systems that support efficient and effective task completion.

5. Foster User Engagement: Encourage user participation, motivation, and enjoyment.

6. Ensure Safety and Security: Design systems that protect user data and prevent harm.

7. Support Human Needs: Understand and address user needs, behaviors, and preferences.

By achieving these goals, HCI aims to create technology that is:

1. User-centered
2. Intuitive

3. Effective

4. Enjoyable

5. Accessible

Ultimately, HCI seeks to improve human lives through technology.

Expert analysis:

1. Usability evaluation: Experts assess a system's usability, identifying issues and areas for improvement.

2. Interface inspection: Experts examine the interface, checking for consistency, clarity, and adherence to design principles.

3. Heuristic evaluation: Experts evaluate a system against established usability heuristics (principles).

Benefits:

1. Early feedback: Experts provide feedback early in the design process.

2. Improved usability: Expert analysis helps identify and fix usability issues.

3. Cost-effective: Expert analysis can be more efficient than user testing.

Expert analysis is a valuable method for improving the usability and user experience of interactive
systems.

User Research & Behavioral Methods

1. Usability Testing

2. Think-Aloud Protocols

3. Contextual Inquiry

4. Interviews

5. Surveys & Questionnaires

6. Focus Groups

7. Ethnographic Studies

8. Diary Studies

9. A/B Testing

10. Field Studies

11. Controlled Experiments

12. Observational Studies

13. Task Analysis


14. Cognitive Walkthrough

15. Heuristic Evaluation

16. Participatory Design

17. Personas

18. User Journey Mapping

19. Empathy Mapping

20. Card Sorting

Universal Design

Universal Design is an approach to designing products, environments, and interfaces that are accessible and usable by everyone, regardless of age or ability. The concept of Universal Design emerged in the 1970s and 1980s, primarily through the work of Ronald Mace, an architect and designer who advocated for designing buildings and products that were accessible to people with disabilities. Since then, the principles of Universal Design have been widely adopted in various fields, including Human-Computer Interaction (HCI).

Definition and History of Universal Design

Universal Design is defined as "the design of products and environments to be usable by all people, to the greatest extent possible, without the need for adaptation or specialized design". The history of Universal Design is rooted in the disability rights movement, which sought to challenge traditional design practices that often excluded people with disabilities. Over time, the principles of Universal Design have evolved to encompass a broader range of user needs and abilities.

Importance of Universal Design in HCI

In HCI, Universal Design is crucial for creating interfaces that are accessible and usable
by diverse user groups. With the increasing use of technology in everyday life, it is
essential to ensure that digital products and services are designed to be inclusive and
usable by everyone. Universal Design principles can help designers create interfaces that
are intuitive, easy to use, and adaptable to different user needs.

Applying Universal Design in HCI

Applying Universal Design principles in HCI involves a range of strategies and techniques, including designing for accessibility, creating user-friendly interfaces, and testing for usability.

Multimodal interfaces

Human interaction with the world is inherently multimodal (Bunt et al., 1998, Quek et al.,
2002). We employ multiple senses, both sequentially and in parallel, to passively and
actively explore our environment, to confirm expectations about the world and to
perceive new information. We experience external stimuli through sight, hearing, touch,
and smell, and we sense our internal kinesthetic state through proprioception. A given
sensing modality may be used to simultaneously estimate several useful properties of
one’s environment – for example, audio cues may be used to determine a speaker’s
identity and location, to recognize the speaker’s words and interpret the prosody of the
utterance, to estimate the size and other characteristics of the surrounding physical space,
and to identify other characteristics of the environment and simultaneous peripheral
activities. Multiple sensing modalities give us a wealth of information to support
interaction with the world and with one another.

In stark contrast to human experience with the natural world, human–computer interaction has historically been focused on unimodal communication – i.e., information
or data communicated between human and computer primarily through a single mode or
channel, such as text on a screen with a keyboard for input. While, technically, almost all
interaction with computers has been multimodal to some degree – combining typed text
with switches, buttons, mouse movement and clicks, and providing various visual and
auditory output signals (including unintentional but useful audio cues such as the sound
of a hard drive being accessed) – for much of interactive computing’s history, the model
of a single primary channel for data input, and perhaps a different primary channel for
data output, has been the norm.

Multimodal interfaces are interactive systems that seek to leverage natural human
capabilities to communicate via speech, gesture, touch, facial expression, and other
modalities, bringing more sophisticated pattern recognition and classification methods to
human–computer interaction. While these are unlikely to fully displace traditional
desktop and GUI-based interfaces, multimodal interfaces are growing in importance due
to advances in hardware and software, the benefits that they can provide to users, and the
natural fit with the increasingly ubiquitous mobile computing environment (Cutugno et
al., 2012). The goal of research in multimodal interaction is to develop technologies,
interaction methods, and interfaces that remove existing constraints on what is possible in
human–computer interaction, towards the full use of human communication and
interaction capabilities in our interactions. This is an interdisciplinary endeavor that
requires collaboration among computer scientists, engineers, social scientists, linguists,
and many others who bring expertise to bear on understanding the user, the system, and
the interaction.

A history of multimodal interaction

Richard Bolt’s “Put That There” system (Bolt, 1980) is widely regarded as a
groundbreaking demonstration that first communicated the value and opportunity for
multimodal interfaces. Bolt’s group at the MIT Architecture Machine Group (later to
become the Media Lab), built the Media Room, which integrated voice and gesture inputs
to enable a user sitting in a chair to have a rather natural and efficient interaction with a
wall display in the context of a spatial data management system.

Advantages of multimodal interaction

Multimodal interaction systems aim to support the recognition of naturally occurring forms of human language and behavior through the use of recognition-based technologies
(Oviatt, 2003, Waibel et al., 1996). Multimodal interfaces are generally intended to
deliver natural and efficient interaction, but it turns out that there are several specific
advantages of multimodality. Although the literature on formal assessment of multimodal
systems is still sparse, various studies have shown that multimodal interfaces can indeed deliver these advantages in practice.

Input and output modalities

Some of the terms relevant to multimodal interaction – such as modes/modalities, channels, devices, multisensory, multimedia, and multimodal – have subtly or
significantly different meanings in different communities. Blattner and Glinert (1996)
addressed the terminology years ago; Table 1 updates their list of modalities and
examples. In addition to the input modalities listed in the table, emerging technologies such as indirect sensing of neural activity (e.g., brain–computer interfaces) may further broaden the set of available input modalities.

Adaptive Systems

Adaptive systems are a crucial component of Human-Computer Interaction (HCI), enabling technology to adjust to the needs and behaviors of users. In this article, we'll
explore the definition, significance, and evolution of adaptive systems, as well as their
importance in modern technology.

Definition and Significance of Adaptive Systems in HCI

Adaptive systems are designed to modify their behavior in response to user interactions,
preferences, and environmental factors. This adaptability enables systems to provide
more intuitive and responsive user experiences, improving overall usability and user
satisfaction. The significance of adaptive systems lies in their ability to learn from user
behavior and adjust accordingly, creating a more personalized and engaging interaction.

Importance of Adaptive Systems in Modern Technology

Adaptive systems play a vital role in modern technology, as they enable systems to
respond to user needs in a more effective and efficient manner. By adapting to user
behavior and preferences, adaptive systems can:

● Improve user experience through personalized interactions

● Enhance usability by adjusting to user needs and goals

● Increase user engagement through relevant and timely feedback

● Support users with disabilities by providing tailored assistance

Types of Adaptive Systems

Adaptive systems can be categorized into three primary types: user-adaptive, context-
adaptive, and task-adaptive systems. Each type of adaptive system has its unique
characteristics and applications.

User-Adaptive Systems

User-adaptive systems tailor experiences based on user behavior and preferences. These
systems use various techniques, such as machine learning and user modeling, to learn
about user preferences and adapt accordingly. Examples of user-adaptive systems
include:

Personalized product recommendations on e-commerce websites

Customized news feeds on social media platforms

Adaptive difficulty adjustment in video games
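
As a minimal sketch of the user-adaptive idea, the example below implements the adaptive difficulty adjustment case from the list above; the thresholds, the win-rate rule, and the function name are illustrative assumptions, not the algorithm of any particular game.

def adapt_difficulty(current_level, recent_results, step=1):
    # recent_results holds 1 for a win and 0 for a loss.
    win_rate = sum(recent_results) / len(recent_results)
    if win_rate > 0.7:                      # player is winning easily: raise difficulty
        return current_level + step
    if win_rate < 0.3:                      # player is struggling: lower difficulty
        return max(1, current_level - step)
    return current_level                    # otherwise leave it unchanged

# Example: the player won 4 of the last 5 rounds, so the system adapts upward.
print(adapt_difficulty(current_level=3, recent_results=[1, 1, 0, 1, 1]))   # prints 4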

HCI Research Approaches and Frameworks:

Here, approaches and frameworks are defined as having the same scope – that of HCI
research. However, frameworks are more rigorously specified. They are more complete,
coherent and fit for purpose with respect to HCI research than approaches. Included here
are HCI research approaches classified as innovation, art, craft, applied, science and
engineering. Each type of approach has an associated HCI research framework (see 4.3–
9.3). Other types of HCI approach and framework could be created as required. The
creation would follow the same process proposed here, as supported by the textbook’s
research practice assignments, presented at the end of each chapter (see 15.3). The
frameworks to be applied to the approaches have a common basis in a conception for
HCI originally proposed respectively for the HCI discipline (Long and Dowell, 1989) and
for the HCI design problem (Dowell and Long, 1989).

User Support

User support refers to the assistance provided to users to help them overcome difficulties
or challenges they encounter while using a computer system or application. The primary
goal of user support is to empower users to achieve their goals and maximize their
productivity. Effective user support is essential in reducing user frustration, improving
user satisfaction, and increasing overall system usability.

Types of User Support: Proactive and Reactive

There are two primary types of user support: proactive and reactive.
● Proactive User Support: Proactive user support involves anticipating and addressing user needs before they become issues. This type of support includes providing users with relevant information, guidance, and resources to help them navigate the system or application. Examples of proactive user support include:

- Contextual help and tooltips (see the sketch below)

- User documentation and guides

- In-app tutorials and walkthroughs
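
As a small sketch of proactive support via contextual help, the example below adds a hover tooltip to a button using Python's standard Tkinter toolkit; the Tooltip helper class is an illustrative construction, not part of Tkinter itself, so its details are assumptions rather than a reference implementation.

import tkinter as tk

class Tooltip:
    # Shows a small help label next to a widget while the pointer hovers over it.
    def __init__(self, widget, text):
        self.widget, self.text, self.tip = widget, text, None
        widget.bind("<Enter>", self.show)
        widget.bind("<Leave>", self.hide)

    def show(self, _event):
        self.tip = tk.Toplevel(self.widget)
        self.tip.wm_overrideredirect(True)            # borderless pop-up window
        x = self.widget.winfo_rootx() + 20
        y = self.widget.winfo_rooty() + 20
        self.tip.wm_geometry(f"+{x}+{y}")
        tk.Label(self.tip, text=self.text, background="lightyellow",
                 relief="solid", borderwidth=1).pack()

    def hide(self, _event):
        if self.tip is not None:
            self.tip.destroy()
            self.tip = None

root = tk.Tk()
save_button = tk.Button(root, text="Save")
save_button.pack(padx=30, pady=30)
Tooltip(save_button, "Saves the current document to disk")
root.mainloop()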

● Reactive User Support: Reactive user support involves responding to user queries and issues as they arise. This type of support includes providing assistance through various channels, such as phone, email, chat, or in-person support. Examples of reactive user support include:

- Helpdesk support

- Live chat support

- Phone support
