Human-Computer Interaction (HCI)
DECO2500/7250
Dr Chelsea Dobbins
[email protected]
© The University of Queensland 2024
This content is protected and may not be shared, uploaded, or distributed.
08: User-Based Evaluations and Data Analysis
In this session…
• Role and key issues of data gathering
• User Evaluations
  - Quantitative and qualitative data analysis
• User-Based Evaluation Methods
  - System Usability Scale (SUS)
  - Technology Acceptance Model (TAM)
  - Time-on-Task
  - Think Aloud
Recap from Last Week
• UX Goals provide concrete targets for your design that relate back to requirements gathered from your users
  - They need to be SMART
  - They keep your design and engineering effort grounded in what is important for success
• HTA diagrams describe tasks and sub-tasks at a detailed level
  - Allow designers to illustrate different potential task sequences that may occur through an interaction with a system
• Interaction flow diagrams are beneficial for communicating user needs
User Evaluations (Usability Testing)
• "Usability testing" is often used interchangeably with "user testing" or "user-based evaluations"
• A testing session consists of:
  - A researcher (called a "facilitator" or "moderator") asking a participant to perform one or more tasks, usually using one or more specific user interfaces
  - While the participant completes each task, the researcher observes the participant's behaviour and listens for feedback
Role of Data Gathering
• Early focus on users and tasks
• Develop specific usability/user experience criteria
• Use empirical measurement
• Iteration
Key Data Gathering Issues
• Goals vary from study to study, and a tool that is well suited to one study might not be effective at all for another
• Setting goals for data gathering
• Identifying potential participants
• Relationship with participants
• Triangulation
• Pilot studies
Ethical Treatment of Participants
• Participants have a right to know:
  - The goals of the study
  - What they will be asked to do and how long their involvement will last
  - The possible risks (aim for minimal risk)
  - What will happen to the research findings and who will see them
  - How their personal information and data will be stored and who has access (privacy concerns)
  - That they will not be quoted without their agreement
  - That they can leave when they wish without hindrance or questioning
  - That they will be treated politely and with respect
• Focus:
  - "We're understanding the world you work/play/live in"
  - "The software is being tested, not you!"
Data-Gathering Techniques
User Evaluations
• Can help designers uncover how users perceive a system before, during, and after interacting with it
• Evaluations often only give you part of the story: they give you the "what" but NOT the "why"
User Evaluations
Quantitative
- Relates to numbers
- Countable/measurable
- Fixed values
- Source: counting or measuring
- Structured data collection methods
Qualitative
- Descriptive
- Relates to words or language
- Subjective: open to interpretation
- Represented as themes, patterns, and stories
- Source: observations, interviews
- Data can be observed, not measured
- Semi-structured or unstructured data collection methods
Quantitative and Qualitative
[Figure: overview of quantitative vs qualitative data]
User-Based Evaluations
• Tests the usability and functionality of a system
• Occurs in collaboration with users
• Considered at all stages in the design lifecycle
• Many evaluation methods exist, e.g.
  - System Usability Scale
  - Think Aloud
  - Time on Task
User-Based Evaluation Tips
• Think in terms of research questions, data, and methods
  - What kind of data will answer this question? Which method gives me that data with the most value for my time?
• If you know there are problems with your design, it's a good thing!
  - Gives you clear research questions
• Do a practice (pilot) run before committing to the evaluation protocol
User-Based Evaluations
• Only requires a small group of participants
• ~Five users can uncover 80% of a system's usability problems (but this depends on the quality of the data; see the sketch below)
L. Kantner, "Techniques for managing a usability test," IEEE Transactions on Professional Communication, vol. 37, no. 3, pp. 143–148, Sept. 1994, doi: 10.1109/47.317479.
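A commonly cited model behind this rule of thumb (from Nielsen and Landauer, not the Kantner paper above) estimates the proportion of problems found by n users as 1 − (1 − λ)^n, where λ is the per-user detection rate. A minimal sketch, assuming the often-quoted value λ ≈ 0.31:

```python
# Sketch of the cumulative problem-discovery model: found(n) = 1 - (1 - lam)^n.
# lam = 0.31 is a commonly cited per-user detection rate; it is an assumption
# that varies with the tasks, users, and quality of the data collected.
def proportion_found(n_users: int, lam: float = 0.31) -> float:
    """Expected proportion of usability problems found by n users."""
    return 1 - (1 - lam) ** n_users

for n in range(1, 8):
    print(f"{n} users: ~{proportion_found(n):.0%} of problems found")
# With lam = 0.31, five users find roughly 84%, close to the 80% rule of thumb.
```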
Data Analysis
• One of the most important components of a study
• Weak analysis produces inaccurate results
• Inaccurate results undermine the authenticity of the research and make the findings unusable
• It is vital to choose your data analysis methods carefully to ensure that your findings are insightful and actionable
• When to use each method depends on the research questions
Simple Quantitative Analysis
• Standard deviation: the average deviation from the mean (a short sketch follows below)
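A minimal sketch of computing these statistics in Python (the task times are invented illustration values):

```python
# Simple quantitative analysis on a small set of task times (invented values).
import statistics

task_times = [32.5, 41.0, 28.7, 55.2, 38.9, 44.1]  # seconds per participant

mean = statistics.mean(task_times)
sd = statistics.stdev(task_times)  # sample standard deviation

print(f"Mean: {mean:.1f} s")
print(f"Standard deviation: {sd:.1f} s (average spread around the mean)")
```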
Qualitative Analysis
• Many complex and socially based phenomena in HCI cannot be easily quantified
• Qualitative research aims to:
  - Understand the qualities of a particular technology
  - How people use it in their lives
  - How they think about it
  - How they feel about it
  - Understand why
  - Explain data and patterns
A. Adams, P. Lunt, and P. Cairns, "A Qualitative Approach to HCI Research," in Research Methods for Human-Computer Interaction, 2008, pp. 138–157.
https://humansofdata.atlan.com/2018/09/qualitative-quantitative-data-analysis-methods/
Qualitative Analysis
Content Analysis
- Analyses documented information
- Usually used to analyse responses from interviewees
Narrative Analysis
- Analyses content such as interviews, observations from the field, or surveys
- Focuses on using the stories and experiences shared by people to answer the research questions
Discourse Analysis
- Used to analyse interactions with people
- Analyses the social context in which the communication between the researcher and the respondent occurred
- Looks at the respondent's day-to-day environment and uses that information during analysis
Grounded Theory
- Uses the data to explain why a certain phenomenon happened
- Studies a variety of similar cases in different settings and uses the data to derive causal explanations
Thematic Analysis
- Used to deduce the meaning behind the words people use
- Discovers repeating themes in text that reveal key insights into the data
- The outcome is a code frame that captures themes in terms of codes, also called categories (see the tally sketch after this table)
Sentiment Analysis
- The process of detecting positive or negative sentiment in text
- Focuses on the polarity of a text (positive, negative, neutral)
- Can detect specific feelings and emotions (angry, happy, sad, etc.), urgency (urgent, not urgent) and even intentions (interested vs. not interested)
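To make the thematic-analysis outcome concrete, here is a minimal sketch of tallying a code frame across coded interview excerpts; the codes and quotes are invented for illustration:

```python
# Minimal sketch: tallying codes from a thematic-analysis code frame.
# The codes and coded excerpts below are invented for illustration.
from collections import Counter

# Each transcript excerpt has been hand-coded with a theme from the code frame.
coded_excerpts = [
    ("P01", "I never know which button submits the form", "navigation_confusion"),
    ("P02", "The colours make it feel friendly", "positive_aesthetics"),
    ("P03", "I gave up looking for the settings page", "navigation_confusion"),
    ("P04", "Loading took forever on my phone", "performance_frustration"),
    ("P05", "I wasn't sure if my payment went through", "feedback_uncertainty"),
    ("P06", "Again, where is the settings page?", "navigation_confusion"),
]

theme_counts = Counter(code for _, _, code in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpt(s)")
```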
User-Based Evaluations
Effectiveness
- Whether or not a user is able to complete certain tasks with your system
- How effectively they manage to complete the task
Efficiency
- Assesses how quickly, and with what amount of effort, users are able to complete those tasks
Satisfaction
- Evaluates users' overall satisfaction with the experience your system provides
- Assesses user perceptions, preferences, and emotional responses
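A rough sketch of how these three measures might be summarised from per-participant session records (all field names and values are invented):

```python
# Minimal sketch: summarising effectiveness, efficiency, and satisfaction
# from per-participant session records (all values invented for illustration).
import statistics

sessions = [
    # (participant, task completed?, time in seconds, satisfaction rating 1-5)
    ("P01", True,  42.0, 4),
    ("P02", True,  61.5, 3),
    ("P03", False, 90.0, 2),
    ("P04", True,  38.2, 5),
    ("P05", True,  55.9, 4),
]

completed = [s for s in sessions if s[1]]

effectiveness = len(completed) / len(sessions)                # completion rate
efficiency = statistics.mean(t for _, _, t, _ in completed)   # mean successful time
satisfaction = statistics.mean(r for _, _, _, r in sessions)  # mean rating

print(f"Effectiveness: {effectiveness:.0%} of tasks completed")
print(f"Efficiency: {efficiency:.1f} s mean time (successful attempts only)")
print(f"Satisfaction: {satisfaction:.1f} / 5 mean rating")
```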
System Usability Scale (SUS)
System Usability Scale (SUS)
• Measures perceptions of usability (Brooke, 1986)
• A "quick and dirty" survey scale
• Used to provide an overall usability assessment measurement:
  - Effectiveness: can users successfully achieve their objectives?
  - Efficiency: how much effort and resource is expended in achieving those objectives?
  - Satisfaction: was the experience satisfactory?
• Generally technology-agnostic and non-proprietary
Measuring Perceptions of Usability
[Figure: the ten SUS statements, each rated on a 5-point scale from 1 (strongly disagree) to 5 (strongly agree)]
Calculating a SUS Score
1. Odd-numbered questions (positively worded): subtract 1 from the user's response
2. Even-numbered questions (negatively worded): subtract the user's response from 5
3. Add up the converted values
4. Multiply the total by 2.5
This gives a score out of 100. Note that this is not a percentage, although the score can later be mapped to a percentile rank.
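A minimal sketch of this scoring procedure (the example responses are invented):

```python
# Minimal sketch of the standard SUS scoring procedure for one respondent.
def sus_score(responses: list[int]) -> float:
    """responses: 10 answers, each 1-5, in questionnaire order (Q1..Q10)."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:          # odd items are positively worded
            total += r - 1
        else:                   # even items are negatively worded
            total += 5 - r
    return total * 2.5          # scales the 0-40 total to 0-100

# Example (invented responses):
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```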
Interpreting SUS
[Figure: SUS score interpretation scale. Image source: https://measuringu.com/interpret-sus-score/]
• The average SUS score (at the 50th percentile) is 68 (C rated)
• Scores >= 80.3 are A rated
• Scores =< 51 are F rated
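A small sketch encoding just the three cut-offs above; published grading scales (e.g. Sauro and Lewis) use finer-grained bands:

```python
# Maps a SUS score to the coarse bands quoted on the slide (an approximation).
def sus_grade(score: float) -> str:
    if score >= 80.3:
        return "A"
    if score <= 51:
        return "F"
    return "C (near the average of 68)"

print(sus_grade(85.0))  # -> A
```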
Benefits vs Drawbacks
Benefits
- Cheap and quick
- Very easy to administer
- An industry standard
- Small sample size acceptable (minimum 5 users)
- Repeatable
Drawbacks
- Complex scoring system
- Results are not a percentage
- Not a diagnostic tool
- Not to be used in isolation
Technology Acceptance Model (TAM)
Technology Acceptance Model (TAM)
• An information systems theory that illustrates how users come to accept and use a technology (Davis, 1989)
• Perceived usefulness (PU): the degree to which a person believes that using a particular system would enhance his or her job performance
• Perceived ease-of-use (PEOU): the degree to which a person believes that using a particular system would be free from effort
[Figure: Technology Acceptance Model (TAM) from Davis, 1989]
Davis, F. D. (1989), "Perceived usefulness, perceived ease of use, and user acceptance of information technology," MIS Quarterly, vol. 13, no. 3, pp. 319–340, doi: 10.2307/249008.
Technology Acceptance Model (TAM)
PU1: I can accomplish my [description of task] more quickly using [name of system]
PU2: I can accomplish my [description of task] more easily using [name of system]
PU3: [Name of system] enhances my effectiveness in utilizing [type of service]
PU4: [Name of system] enhances my efficiency in utilizing [type of service]
PU5: [Name of system] enables me to make better decisions in utilizing [type of service]
PU6: Overall, I find [name of system] useful
PEOU1: Learning to use [name of system] is easy for me
PEOU2: It is easy to use [name of system] to accomplish my [task]
PEOU3: Overall, I believe [name of system] is easy to use
ATT1: In my opinion, it is desirable to use [name of system]
ATT2: I think it is good for me to use [name of system]
ATT3: Overall, my attitude towards [name of system] is favourable
ITO1: I will use [name of system] on a regular basis in the future
ITO2: I will frequently use [name of system] in the future
ITO3: I will strongly recommend others to use [name of system]
Technology Acceptance Model (TAM)
• Items are scored on a 7-point Likert scale (1–7)
• Statistics are then generated for each dimension:
  - Average
  - Standard deviation
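A minimal sketch of generating those per-dimension statistics; the item names follow the table above, and the response values are invented:

```python
# Minimal sketch: per-dimension mean and SD for TAM responses.
# Item names follow the slide's table; the 7-point response values are invented.
import statistics

responses = {
    "PU1": [6, 5, 7, 6, 5], "PU2": [5, 5, 6, 6, 4],
    "PEOU1": [4, 3, 5, 4, 4], "PEOU2": [5, 4, 6, 5, 5],
}

dimensions = {"PU": ["PU1", "PU2"], "PEOU": ["PEOU1", "PEOU2"]}

for dim, items in dimensions.items():
    scores = [r for item in items for r in responses[item]]
    print(f"{dim}: mean={statistics.mean(scores):.2f}, "
          f"sd={statistics.stdev(scores):.2f}")
```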
TAM 2 and TAM 3
[Figure: the extended TAM 2 and TAM 3 models]
Source: https://acceptancelab.com/technology-acceptance-model-tam
Benefits vs Drawbacks
Benefits
  TAM 1
  - Easy to understand
  - Has demonstrated a high level of predictiveness in many contexts
  TAM 2 and 3
  - Take external and social influences into consideration as well
  - Both models have been successfully applied to a wide variety of innovations
Drawbacks
  TAM 1
  - Originally developed for the adoption of IT at the workplace
  - Neglects the diverse needs of users, including subjective norms or social impact
  - The central constructs (PU and PEOU) provide no information about how to make technology more useful and easier to use
  TAM 2 and 3
  - Very complex due to the multitude of factors incorporated
Time-on-Task
Time-on-Task
• A simple metric to track how quickly and efficiently users perform specific tasks with your system
• Easy to apply and to understand (which is why it is used so often)
• Used to identify usability problems in goal-directed tasks
  - High-level tasks, such as finding a product on an online shopping site
  - Low-level tasks, such as a pointing task modelled by Fitts' Law
Time-on-Task
• There's no perfect time for this metric; it always depends on the task and its complexity
• Benchmark with one expert
• Three "events" to measure:
  - Average task completion time: only users who completed the task successfully (most common)
  - Mean time to failure: the average time users spend on the task before they give up or complete it incorrectly
  - Average time on task: the total duration users spend on your task
Time-on-Task
1. Define the specific task users must complete.
2. Set clear start and end points for the task.
3. Use tracking tools or software to measure the time taken.
4. Analyse the data, considering outliers or disruptions (a sketch follows below).
5. Refine the task or UI based on findings to optimize the time.
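As a minimal sketch of step 4, here are the three "events" from the previous slide computed from per-participant records (all values invented):

```python
# Minimal sketch: the three time-on-task "events" described above,
# computed from invented session records.
import statistics

# (participant, seconds on task, completed successfully?)
records = [
    ("P01", 42.0, True), ("P02", 61.5, True), ("P03", 90.0, False),
    ("P04", 38.2, True), ("P05", 120.4, False), ("P06", 55.9, True),
]

times_all = [t for _, t, _ in records]
times_ok = [t for _, t, ok in records if ok]
times_fail = [t for _, t, ok in records if not ok]

print(f"Average task completion time: {statistics.mean(times_ok):.1f} s")
print(f"Mean time to failure:         {statistics.mean(times_fail):.1f} s")
print(f"Average time on task:         {statistics.mean(times_all):.1f} s")
```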
Time-on-Task Analysis
[Figures: worked examples of time-on-task analysis]
Benefits vs Drawbacks
Benefits
- Objective measurement: a quantifiable metric to assess efficiency
- Performance benchmarking: enables comparison of different design iterations or competitor products
- User efficiency: helps identify how quickly users can achieve their goals
- Optimizing workflows: pinpoints areas of the design causing delays or confusion
- Identifying pain points: long task times can highlight areas needing redesign or rethinking
- Supporting design decisions: data-backed evidence to justify design changes
Drawbacks
- Doesn't consider the diversity of people and often requires multiple rounds of testing
- Average task times are an inaccurate metric when users think out loud
- Task times are only for benchmarking, not for identifying problems
Think Aloud
Think Aloud
• Users verbalize their thoughts, feelings, and opinions while interacting with a system
• Useful for capturing a wide range of cognitive activities
• Two variations exist:
  - Specific task
  - Open-ended
What Happens When Someone…
• Goes quiet while they use the prototype?
  - Ask small reminder questions such as "What were you thinking about when you chose that option?" You can ask "Tell me what you're thinking", but don't come across as pushy.
• Does something unexpected/wrong?
  - Let it happen, and let them explain what they were thinking the entire time they got it wrong. If they get stuck or can't correct their mistake, step in to help, but ask about what happened.
• Starts giving you UI design suggestions?
  - Ask them why they think their suggestion would work for them; don't just take suggestions at face value.
• Forgets what they wanted to say and moves on?
  - There's another type of think aloud (retrospective) that uses a recording after a participant has done a task. In a similar way, you could ask them to review what they did and try to remember what they were thinking at the time.
Think Aloud Considerations
• Recruit representative users.
• Give them representative tasks to perform.
• Be quiet and let the users do the talking.
Clemmensen, T., Hertzum, M., Hornbæk, K., Shi, Q., and Yammiyavar, P. G., "Cultural Cognition in the Thinking-Aloud Method for Usability Evaluation," ICIS (2008).
Benefits vs Drawbacks
Benefits
- Useful for understanding how the user approaches the interface
- Rapid, high-quality, qualitative feedback
- Broad range of detailed data
- Errors can be clarified
- Cheap, robust, and flexible
- Meaningful dialogue
- Versatile
- Usually the majority of major issues can be found
Drawbacks
- The unnatural situation makes it difficult for participants to keep up the required monologue
- Small sample sizes make it difficult to know the relative importance of the problems identified
- Talking aloud changes the time spent on tasks
- Statements are filtered (vs. a brain dump), as users are supposed to say things as soon as they come to mind rather than reflect on their experience
- Risk of biasing user behaviour: prompts and clarifying questions are usually necessary, but from an untrained facilitator such interruptions can very easily change user behaviour
Presenting Findings
• Only make claims that your data can support
• Dependent on the audience, purpose, data gathering, and analysis undertaken
• Graphical representations are always powerful
Presenting Findings
[Figure: questionnaire with 5-point Likert items (strongly disagree to strongly agree); collation of results from six participants (P01–P06)]
• Choose a graph more suitable for your data
• Some ways of viewing data:
  - Raw numbers (1–5 ratings)
  - Trends (means, SDs)
  - Participant patterns
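As one sketch of such a view (assuming matplotlib is available; the ratings are invented), per-question means with standard-deviation error bars:

```python
# Minimal sketch: per-question mean ratings with SD error bars
# for 5-point Likert data from six participants (values invented).
import statistics
import matplotlib.pyplot as plt

# rows = participants P01-P06, columns = questions Q1-Q4
ratings = [
    [4, 3, 5, 2],
    [5, 3, 4, 1],
    [4, 2, 5, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 2],
    [4, 3, 4, 2],
]

questions = ["Q1", "Q2", "Q3", "Q4"]
means = [statistics.mean(col) for col in zip(*ratings)]
sds = [statistics.stdev(col) for col in zip(*ratings)]

plt.bar(questions, means, yerr=sds, capsize=4)
plt.ylim(1, 5)
plt.ylabel("Mean rating (1 = strongly disagree, 5 = strongly agree)")
plt.title("Questionnaire results (P01-P06)")
plt.show()
```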
Current Bachelor of Information Technology Student Survey 2024
What is the benefit of taking the survey?
- This is your opportunity to provide feedback to help shape the future of Information Technology at UQ
- We use your responses to improve the program and teaching
- Responses and feedback are confidential
- Complete the survey by 3rd May 2024 to enter a draw to win one of 5 Coles/Myer gift cards valued at $100 (must be a current student enrolled in one of the BInfTech or dual degree programs)
How do I complete the survey?
1. Visit https://uniofqueensland.syd1.qualtrics.com/jfe/form/SV_5zoiVmcXhDtp9dQ or scan the QR code.
2. Constructive feedback is very helpful. Please be respectful and specific – provide examples.
3. Complete before 3rd May 2024.
Summary
• There are many complex and socially based experiences in HCI that cannot be easily quantified or experimentally manipulated
• There are many evaluation methods for usability; each uses a different approach and theory
• You need to understand the theory behind an approach to understand what is being measured
• Data analysis depends on the data gathered
Looking ahead…
• In our next session, we will look at Evaluating Usability: "Expert" or "Non-User" Evaluations