E-Assessment & Learning Analytics
7. Introduction to Learning Analytics
Prof. Dr. Sven Strickroth
SS 2021
Group Project
• Next week: June 16th, 12:15
• presentation of current state of the project
• idea refinements
• architecture
• designs
• examples
• …
• ~ 7-10 minutes (15 minutes per group)
Motivation
“Education is a sector far behind the curve in taking
advantage of the advances being made in data
science in adjacent sectors of the economy”
(R. Pea / LAW 2014)
Knowledge about the learning process is helpful for teachers
Educational institutions and policy makers need information as a foundation for decisions
Knowledge about learning behavior is helpful for learners
Knowledge about the learning process is the foundation for many other services
• Feedback & Assessment
• Adaptive Learning Applications
Horizon Report
Yearly panels of experts from higher education (98 experts in 2019)
• What is on the five-year horizon for higher education institutions?
• Which trends and technology developments will drive educational change?
• What are the critical challenges and how can we strategize solutions?
Image Source: Educause
https://library.educause.edu/resources/2019/4/2019-horizon-report
Image source: https://www.infodocket.com/2019/04/24/educause-releases-2019-horizon-report-higher-education-edition/
Outline
Roots of Learning Analytics
• Academic Analytics
• Educational Data Mining
Learning Analytics Process
Learning Analytics Reference Model
Learning Objectives
You
• can explain what Learning Analytics are.
• can explain the concepts of Academic/Action Analytics,
Educational Data Mining and Learning Analytics.
• can compare Academic/Action Analytics and Educational
Data Mining to Learning Analytics.
• can explain the Learning Analytics process.
• know what the Learning Analytics Reference Model is.
• can explain the four dimensions of the Reference Model
and give examples and challenges for each dimension.
What are Learning Analytics?
• Siemens, 2010
• “Learning analytics is the use of intelligent data, learner-produced data, and
analysis models to discover information and social connections, and to
predict and advise on learning.”
• LAK’11 and Society for Learning Analytics Research (SoLAR)
• Learning analytics is “the measurement, collection, analysis and reporting of
data about learners and their contexts, for purposes of understanding and
optimizing learning and the environments in which it occurs.”
→ These definitions share an emphasis on converting educational data
into useful actions to foster learning.
Roots of Learning Analytics
The (short) history of learning analytics
since 1983: Conferences and workshops on Intelligent
Tutoring Systems and Artificial Intelligence in education
2008: first conference on Educational Data Mining
(before: satellite workshops)
2009: Journal of Educational Data Mining (JEDM)
2011: first conference on Learning Analytics and Knowledge
2013: GI working group and workshops on Learning Analytics
2014: Journal of Learning Analytics
Related research fields
• Web Analytics
• Business Intelligence/Analytics
• Recommender Systems
• Personalized Adaptive Learning
Figure: overlapping fields (Web Analytics, Business Intelligence, Recommender Systems, EDM, Information Visualization, Action Research) and a timeline: Academic/Action Analytics (2005) → Educational Data Mining (2008) → Learning Analytics (2011)
Academic Analytics (AA)
… is often used to
• Support academic institutions in decision making (e.g. new courses, mentoring,
financial decisions)
• Predict student success (improve student graduation and retention rates)
• Detect “at risk students” who might drop out of the course or abandon their studies
… is
• The application of business intelligence tools and practices in higher education
• A combination of QA data with statistical techniques and predictive modeling
“Action Analytics” – move from data to reporting to analysis to action
Example: Course Signals
• Project at Purdue University (USA):
• Identifying at-risk students who can then receive attention to avoid
failure in a particular course
• Based on previous academic performance, demographic
data and effort put into the course
• Reports, risk groups, stoplight ratings, intervention emails
Course Signals Explanation
Source: pixabay.com, Author: Clker-Free-Vector-Images (CC0 1.0)
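Course Signals combines prior performance, demographics, and course effort into a stoplight rating. A minimal sketch of the idea in Python; the weights, thresholds, and function names are invented for illustration and are not Purdue's actual model:

```python
# Hypothetical at-risk stoplight rating in the spirit of Course Signals.
# Weights and thresholds are invented for illustration only.

def risk_score(gpa, logins_per_week, assignments_done, assignments_total):
    """Combine prior performance and course effort into a 0..1 risk score."""
    performance = gpa / 4.0                    # prior academic performance
    effort = min(logins_per_week / 5.0, 1.0)   # activity/effort in the course
    completion = assignments_done / assignments_total
    # Higher performance, effort, and completion lower the risk.
    return 1.0 - (0.4 * performance + 0.3 * effort + 0.3 * completion)

def stoplight(score):
    """Map a risk score to a traffic-light signal."""
    if score < 0.3:
        return "green"    # on track
    if score < 0.6:
        return "yellow"   # potential problems
    return "red"          # high risk of failing the course

print(stoplight(risk_score(gpa=3.5, logins_per_week=4,
                           assignments_done=9, assignments_total=10)))
```

A real model would be trained on historical course data rather than hand-picked weights; the sketch only shows how heterogeneous inputs can be reduced to an actionable signal.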
Action Research
• Research method to investigate and evaluate one's own teaching
• Guiding questions: How effective is the course? Is it meeting the needs of the students? How can the needs of learners be better supported? What interactions are effective? How can they be further improved?
• Teachers learn more about their teaching
• Monitoring, analysis, awareness, reflection
• Questions, answers, actions for improvement
AA vs. LA
Academic Analytics (AA):
• Needs of academic institutions
• Administration/Organization (e.g. decision making, enrollment management, student success, at-risk students)
• Mainly statistical software
Learning Analytics (LA):
• Needs of learners and teachers
• Teaching/Learning (e.g. monitoring, reflection, awareness, adaptation, personalization, recommendation)
• Various analytics methods (e.g. data mining, SNA)
Educational Data Mining
• Educational Data Mining (EDM) is concerned with developing methods to explore the unique types of data that come from an educational context and using those methods to better understand students and the settings in which they learn
• EDM is the application of data mining techniques to educational data
EDM vs. LA
Educational Data Mining (EDM):
• Technological focus (data-driven analytics)
• Adaptation / no human in the loop (greater focus on automated adaptation, e.g. by the computer)
• Mainly data mining techniques
Learning Analytics (LA):
• Pedagogical/educational focus (learning-focused analytics)
• Personalization / human in the loop (greater focus on informing and empowering instructors and learners)
• Statistics, visual analytics, dashboards, SNA
Focus of the lecture
Figure: EDM and LA as overlapping fields
Learning Analytics Process
Types of Learning Analytics
• Descriptive
• What happened?
• Diagnostic
• Why did something happen?
• Predictive
• What is likely to happen?
• Prescriptive
• What should be done?
Learning Analytics Process
A continuous cycle:
• The learning activity produces raw data
• Data collection and data aggregation
• Data analysis, visualization and reports turn questions into answers
• Action/intervention, adaptation, reflection & behavioural change feed back into the learning activity
Learning Analytics Process
• Data collection and pre-processing: data cleaning, data integration, data aggregation, data transformation, data reduction, data modeling, user identification
• Analytics and action: mining, analysis, visualization, reports, monitoring, …; intervention, adaptation
• Post-processing: additional data, refined data, new attributes, new indicators/metrics, new LA methods
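One pass through this cycle (collection → cleaning → aggregation → report) can be sketched as a toy pipeline; all field names, values, and cleaning rules are invented for illustration:

```python
# Toy LA pipeline: collect -> clean -> aggregate -> report.
# All field names and rules are invented for illustration.
from collections import defaultdict

# Data collection: raw interaction log from a learning environment.
raw_log = [
    {"user": "alice", "action": "view_page",  "minutes": 5},
    {"user": "alice", "action": "view_page",  "minutes": -1},  # broken entry
    {"user": "bob",   "action": "post_forum", "minutes": 12},
    {"user": "alice", "action": "post_forum", "minutes": 8},
]

# Data cleaning: drop implausible records.
clean = [e for e in raw_log if e["minutes"] >= 0]

# Data aggregation: total time and action count per learner.
profile = defaultdict(lambda: {"minutes": 0, "actions": 0})
for e in clean:
    profile[e["user"]]["minutes"] += e["minutes"]
    profile[e["user"]]["actions"] += 1

# Reporting: a simple indicator per learner.
for user, p in sorted(profile.items()):
    print(f"{user}: {p['actions']} actions, {p['minutes']} min online")
```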
Learning Analytics Reference Model (Chatti et al., 2012)
Four dimensions around Learning Analytics:
• What? – Data, Environments, Context
• Who? – Stakeholders
• Why? – Objectives
• How? – Methods
Learning Analytics Reference Model
What? – What kind of data does the system gather, manage, and use for analytics?
What? (Data, Environments, Context)
Data sources range from protected data (learner traces, institutional systems) to open data (distributed environments).
• Learners' traces
• Explicit actions (e.g. completing assignments, taking exams, prompts)
• Tacit actions (e.g. social interactions, online activities)
• Sensor data (e.g. electrodermal activity, heart beat rate, gaze detection, …)
• Context (e.g. location, time, …) through sensors
• Centralized educational systems
• Learning management systems (LMS), student information systems (SIS)
• Logs of the students' activities and interactions
• Often used in formal learning settings
• Distributed learning environments
• Personal learning environment (PLE) concept
• Social Web, user-generated content (UGC)
• Massive Open Online Courses (MOOCs)
• Apps, …
What? - Challenges
• [Big] data
• Explosion in the quantity (and sometimes quality) of available and potentially relevant data (Diebold, 2000)
• Datasets that are too large to be managed or analyzed by typical database software tools (Manyika et al., 2011)
• Data privacy
• Identity
• Multimodality
• e.g., Affective Computing, usage context, gestures, …
• Openness
• Fragmentation
• Heterogeneity
• Data Formats (e.g., xAPI, IMS Caliper)
Image source: pixabay.com, author: OpenClipart-Vectors (CC0 1.0)
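Standardized formats such as xAPI address the heterogeneity challenge. As an illustration, a minimal xAPI-style statement follows the actor/verb/object structure defined by the specification; the learner name, mailbox, and activity URL below are made-up example values:

```python
import json

# Minimal xAPI-style statement (actor-verb-object). The verb id is a
# standard ADL verb; the actor and object values are made up examples.
statement = {
    "actor": {"mbox": "mailto:learner@example.org", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.org/courses/la101/quiz-1",
        "definition": {"name": {"en-US": "Quiz 1"}},
    },
}
print(json.dumps(statement, indent=2))
```

Because every tool emits the same actor/verb/object shape, statements from an LMS, a MOOC platform, and a mobile app can be collected in one learning record store and analyzed together.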
Learning Analytics Reference Model
Who? – Who is targeted by the analytics?
Who? (Stakeholder Dimensions)
• data subjects
• suppliers of data, normally through their browsing and interaction
behaviour (e.g., learners)
• data clients
• receivers of information gathered from data subjects
• Researchers/System designers
• study and share the experience on the effects of TEL environments
and of integrating learning analytics.
28
Who? (Stakeholders II)
A hierarchy of levels, from Academic Analytics at the top to Learning Analytics at the personal level:
• National/International, Regional: comparison between systems and standards (Academic Analytics)
• Institutional, Departmental: predictive modeling, success and failure patterns, performance of academics, knowledge flow, decision making, resource allocation (Academic Analytics)
• Teaching, Course: discourse, social networks, conceptual development, learning resources use (Course Analytics)
• Personal: personal performance in relation to learning goals, learning resources, and study habits of other classmates; improve grades & learning (Learning Analytics)
Who? - Challenges
• Competences (e.g., IT skills, data evaluation and interpretation, self-regulation, reflection)
• Visualization and interaction
• Defining good indicators
• Preventing data misuse (big brother, feedback vs. grading)
• Ethics, privacy, trust, ownership, stewardship (how data is collected, preserved, secured, and shared)
• Worried about privacy and big brother aspects?
• When do you allow tracking?
LA Reference Model
Why? – Why does the system analyze the collected educational data?
Why? (Objectives I)
Monitoring and analysis:
• Tracking student activities
• Generating reports to support decision-making
• Evaluation of the learning process / usefulness of material
• Improving the learning environment
Prediction and intervention:
• Predict learner knowledge/performance
• Find early indicators for success
• Find early indicators for poor marks or drop-out
• Provide proactive intervention for students
Tutoring and mentoring:
• Tutoring: control with the tutor, focus on the teaching process
• Mentoring: control with the learner, focus on the learning process
Why? (Objectives II)
Assessment and feedback:
• Support (self-)assessment of efficiency of the learning process
• Getting intelligent feedback
Adaptivity:
• Tell learners what to do next (system-driven)
• Adaptation to the needs of each learner
Personalization and recommendation:
• Personalization (learner-driven)
• Help learners decide what to do next
• Recommending learning resources, people, etc.
Reflection and awareness:
• Monitor own activities
• Compare data, evaluate past work
• Improving future experience/performance
Example: Self-Regulated Learning
(Zimmerman & Campillo, 2003)
A cyclical model with three phases:
• Forethought phase: task analysis (goal setting, …)
• Performance phase: self-control, self-observation
• Self-reflection phase: self-judgment, self-reaction
Learners use analytics to: (Dyckhoff et al., 2013)
• Monitor their own activities and interactions
• Compare their activity with that of others
• Increase awareness, reflect and self-reflect
• Improve discussion participation
• …
Why - Challenges
• Indicators
• Metrics (beyond grades)
• Formal learning
• Self-regulated learning
• Network learning
• Professional learning
• Informal and lifelong learning
• “Polycontextual” profiling
• Lifelong learner modeling
Image source: pixabay.com, author: OpenClipart-Vectors (CC0 1.0)
LA Reference Model
How? – How does the system perform the analysis of the collected data?
How? (Methods)
• Statistics
• Information Visualization
• Data Mining
• Discussion Analytics
• Social Network Analysis (discussed separately)
• Learner Modelling (discussed separately)
• Recommender Systems
• …
Statistics
• Providing basic statistics of the learner’s interaction with a system
• Examples:
• time online
• total number of visits
• number of visits per page
• frequency of postings
• percentage of material read
• Examples of statistics tools:
• R
• Scythe
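The indicators above can be computed directly from an interaction log. A minimal sketch in Python; the log fields and values are invented for illustration:

```python
# Basic interaction statistics from a toy clickstream log.
# Field names and values are invented for illustration.
from collections import Counter

log = [
    {"page": "intro", "minutes": 4},
    {"page": "quiz",  "minutes": 10},
    {"page": "intro", "minutes": 2},
    {"page": "forum", "minutes": 7},
]

total_minutes = sum(e["minutes"] for e in log)      # time online
total_visits = len(log)                             # total number of visits
visits_per_page = Counter(e["page"] for e in log)   # number of visits per page

print(f"time online: {total_minutes} min")
print(f"total visits: {total_visits}")
print(f"visits per page: {dict(visits_per_page)}")
```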
Information Visualization
• Visual representation of information
• Help people to understand information
• Techniques:
• Charts
• Scatterplots
• 3D representations
• Maps
• …
more details in the next lecture…
Example: Dashboard
Data Mining
• “The process of discovering useful patterns or knowledge from data sources, e.g., databases, texts, images, the Web”
• Data mining methods
• Supervised learning: classification and prediction
• Unsupervised learning: clustering
• Association rule mining
Figure: data mining methods split into predictive (classification, regression, time series analysis, prediction) and descriptive (clustering, summarization, association rules, sequence discovery)
Figure based on Gartner. Evolution of data mining, Gartner Group Advanced Technologies and Applications Research Note, 2/1/95.
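Clustering, as an example of unsupervised learning, can be illustrated with a tiny one-dimensional k-means that groups learners by quiz score. The data, the number of clusters, and the function name are invented for illustration:

```python
# Minimal 1-D k-means clustering sketch (unsupervised learning).
# Scores, initial centroids, and function name are invented examples.

def kmeans_1d(values, centroids, iterations=10):
    """Tiny 1-D k-means: assign each value to the nearest centroid,
    then move each centroid to the mean of its cluster."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for v in values:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        # Recompute centroids; keep the old one if a cluster is empty.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

scores = [12, 15, 14, 48, 52, 50, 90, 95]   # quiz scores of 8 learners
centroids, clusters = kmeans_1d(scores, centroids=[0, 50, 100])
print(centroids)   # one centroid per performance group
print(clusters)    # low, medium, and high scorers
```

The resulting groups could feed the monitoring and prediction objectives above, e.g. flagging the low-scoring cluster for an intervention.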
Clusters & forms of representations within the reference model (Who? Why? What? How?):
• Visualization & reports
• Signals & prediction
• Aggregated data
These support awareness, reflection & behavioral change, human intervention, and adaptation, esp. personalization.
Data & Methods (Hoppe, 2017)
• Learning environments (learning apps, ubiquitous learning environments, social media, e-lectures, e-books, web-based trainings, …) produce machine- and human-readable data about learners: log data, content, learners' artefacts
• Three analysis perspectives:
• Process oriented: based on action logs; temporal patterns, process mining
• Network analysis: based on social relations; network measures
• Content oriented: based on artefacts; text mining, semantic & linguistic analysis, …
• Results feed into learner profiles, visualization & reports, signals & prediction, and machine learning
How? - Challenges
• Design
• Usability (e.g., EDM tools for non-expert users)
• Integration in TEL systems
• Performance
• Scalability
• Extensibility
• Exchange of data
• Does it help?
Learning Analytics Framework
(Greller & Drachsler, 2012)
Stepwise introduction of Learning Analytics
Image taken from Ifenthaler, D. (2016), presentation at Learntec
Wrap up
Recap figure: EDM and LA as overlapping fields
Next Lecture
• Dashboards & Visualizations
Prof. Dr. Sven Strickroth
Ludwig-Maximilians-Universität München
Institut für Informatik
Lehr- und Forschungseinheit für
Programmier- und Modellierungssprachen
Oettingenstraße 67
80538 München
Phone: +49-89-2180-9300
[email protected]
References & Further Reading
• Lang, C., Siemens, G., Wise, A., Gašević, D. (eds., 2017): Handbook of Learning Analytics –
First edition. https://www.solaresearch.org/publications/hla-17/
• Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International
Journal of Technology Enhanced Learning, 4(5-6), 304-317.
• Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. (2012). A reference model for learning
analytics. International Journal of Technology Enhanced Learning, 4(5-6), 318-331.
• Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for
learning analytics. Educational Technology & Society, 15(3), 42–57.
• Yun, Fortenbacher, Pinkwart, Bisson & Moukayed (2017). A Pilot Study of Emotion Detection
using Sensors in a Learning Context: Towards an Affective Learning Companion. In DeLFI 2017
Workshop Proceedings.
• Zimmerman, B. J., & Campillo, M. (2003). Motivating self-regulated problem solvers. The
psychology of problem solving, 233-262.
• Elkina, M., Fortenbacher, A., & Merceron, A. Lehren mit Learning Analytics erste Erfahrungen
mit dem Tool LeMo. In Proceedings DeLFI 2014.
• Hoppe, H. U. (2017). Computational methods for the analysis of learning and knowledge
building communities. Handbook of learning analytics, 23-33.
• Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013). Supporting action
research with learning analytics. In Proceedings of the Third International Conference on
Learning Analytics and Knowledge (pp. 220-229). ACM.