E-Assessment & Learning Analytics

12. Standards & Privacy

Prof. Dr. Sven Strickroth

SS 2021
Learning Objectives

You
• know the components of the LA infrastructure
• know the concept of learning data warehouses
• know different standards for learning data and their focus
• know different ways to process the data in data warehouses
• understand the main aspects of privacy and ethics in the context of Learning Analytics
• can take privacy guidelines into consideration

Learning Analytics Infrastructure
Motivation: Current situation

Observation
• Today's learners…
– … use multiple tools and resources to learn (MOOCs, social networks, learning management systems, blogs, content platforms, learning apps)
– … learn self-regulated

Conclusion
• Learning Analytics on a single platform (e.g., Moodle)…
– … cannot capture all relevant learning
– … leads to inaccurate information about learners
– … cannot serve as a reliable basis for feedback and adaptation mechanisms

Infrastructure components

• Learning data generators and collectors

• Learning data warehouse(s)


• Data models and data standards

• Analytics algorithms and data analysis and manipulation engines
• Learning analytics results delivery strategies (information visualization)
• Learning (analytics) dashboards (already discussed in previous lectures)
• Learning analytics indicators (already discussed in previous lectures)

Learning Analytics Ecosystem

[Figure: learning analytics ecosystem with an LMS, campus management system, adaptive learning application, OER repository, learn app, course platform, and dashboard connected through a central data warehouse]
Learning data sources and collectors

• Learning data
• Interaction data (LMS, phone, …)
• Assessment performance data
• Historical education data
• Personal data
• Sensor data

• Data collection strategies/methods


• LMS logging methods and services
• Sensor data streams
• RFID sensors, location, movement, gesture, audience response, cameras, …
• Mobile monitoring apps
• API endpoints/web services that deliver historical education data and personal data

Learning data warehouse

• Data warehouse components


• Underlying data model (Learning data standards)
• Raw data (learner data)
• APIs for standardized read/write access
• REST CRUD methods (see the read-access sketch after this slide)
• Security

• Properties of a data warehouse


• Scalability, security, performance, …

• Results/analytics data

Examples for data warehouses:
• Learning Locker
• Learning Record Store (LRS)

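As a sketch of the standardized read access mentioned above: any xAPI-conformant LRS exposes a /statements resource, so an analytics component can query it roughly as follows. The endpoint URL and credentials are placeholders; the resource path and version header come from the xAPI specification.

import requests

LRS_ENDPOINT = "https://lrs.example.org/xapi"   # hypothetical LRS endpoint
AUTH = ("client_key", "client_secret")          # placeholder credentials

def fetch_statements(verb_iri, since):
    """Read activity statements filtered by verb and timestamp."""
    response = requests.get(
        f"{LRS_ENDPOINT}/statements",
        params={"verb": verb_iri, "since": since},
        headers={"X-Experience-API-Version": "1.0.3"},
        auth=AUTH,
    )
    response.raise_for_status()
    return response.json()["statements"]

watched = fetch_statements("http://activitystrea.ms/schema/1.0/watch",
                           "2021-01-01T00:00:00Z")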
Processing and analyzing data

Batch processing
• Processing a large volume of data at once
– Data is previously collected and stored
• Cost efficient
• Divide and conquer strategy
– MapReduce paradigm (see the sketch after this slide)
• Takes time

Real time (data-stream processing)
• Time-critical operations
– Results in real time
• Process data as it arrives (asynchronously)
• Immediate response/results
• Expensive, complex, and tedious

→ Batch processing is sufficient for almost all interesting indicators

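To make the divide-and-conquer idea concrete, here is a minimal MapReduce-style sketch: the map phase emits a (verb, 1) pair per statement, the reduce phase sums the counts, and partial results from independently processed chunks are merged at the end. The statements are illustrative stand-ins for previously collected data.

from collections import Counter
from functools import reduce

def map_phase(statements):
    # Map: emit one (verb, 1) pair per statement.
    return [(s["verb"]["id"], 1) for s in statements]

def reduce_phase(pairs):
    # Reduce: sum the emitted counts per verb.
    counts = Counter()
    for verb, n in pairs:
        counts[verb] += n
    return counts

# Previously collected data, split into chunks; in a real MapReduce setup
# each chunk would be processed on a separate node in parallel.
chunks = [
    [{"verb": {"id": "http://activitystrea.ms/schema/1.0/watch"}}],
    [{"verb": {"id": "http://adlnet.gov/expapi/verbs/completed"}},
     {"verb": {"id": "http://activitystrea.ms/schema/1.0/watch"}}],
]
partial_counts = [reduce_phase(map_phase(chunk)) for chunk in chunks]
total = reduce(lambda a, b: a + b, partial_counts)  # merge partial results
print(total.most_common())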
Learning analytics infrastructure

[Figure: overview of the learning analytics infrastructure]
(Learning) data standards

• xAPI data standard


• based on Activity Streams
• Learning Context Data Model (LCDM)
• Contextualized Attention Metadata (CAM)
• IMS Caliper

• Proprietary/Custom data standards


• Sometimes the above-mentioned are overkill
• Only anonymous data
• Custom Entity-Relationship (ER) diagram
• Column-based storage based on indices

Experience API (xAPI, also known as Tin Can API/TCAPI)

Experience API (xAPI)


• Specification for transmitting learner actions (Activity Statements)
• Defined by ADL Initiative (“Advanced Distributed Learning”)
• Three main parts: “Brittany (Actor) viewed (Verb) Recording1 (Object)”
• Extensible with context information (e.g., timestamp, course id, duration)

Learning Record Store (LRS)


• REST API (CRUD) using JSON notation
• Database model validating and
storing activity statements
• Dashboard functionality

LRS Statement Example (JSON Object)

{
  "timestamp": "2021-01-04T08:45:00+01:00",
  "actor": {
    "mbox": "mailto:[email protected]",
    "objectType": "Agent"
  },
  "verb": {
    "id": "http://activitystrea.ms/schema/1.0/watch",
    "display": {
      "en-US": "watched"
    }
  },
  "object": {
    "id": "https://www.youtube.com/watch?v=KxYewO9iAgw",
    "objectType": "Activity"
  }
}

xAPI statements builder (for testing purposes):
– http://adlnet.github.io/xapi-lab/
Full specification and documentation:
– https://github.com/adlnet/xAPI-Spec/blob/master/xAPI-About.md#partone
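Writing such a statement to an LRS is a single REST call. A minimal sketch follows; the endpoint URL, credentials, and actor are placeholders, while the /statements resource and version header follow the xAPI specification.

import requests

statement = {
    "timestamp": "2021-01-04T08:45:00+01:00",
    "actor": {"mbox": "mailto:learner@example.org",   # placeholder actor
              "objectType": "Agent"},
    "verb": {"id": "http://activitystrea.ms/schema/1.0/watch",
             "display": {"en-US": "watched"}},
    "object": {"id": "https://www.youtube.com/watch?v=KxYewO9iAgw",
               "objectType": "Activity"},
}

# POST /statements stores the statement; the LRS validates it against the
# xAPI data model first and returns the id(s) it assigned.
response = requests.post(
    "https://lrs.example.org/xapi/statements",      # hypothetical endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("client_key", "client_secret"),           # placeholder credentials
)
response.raise_for_status()
print(response.json())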
xAPI Vocabulary (Ehlenz et al., 2019)

Verbs are referenced using an Internationalized Resource Identifier (IRI)


• unique identifier
• unlike URIs, IRIs can contain some characters outside of the ASCII character set in
order to support international languages
• IRIs always include a scheme

General problems:
• exchange of verbs → repositories/registries (e.g., https://registry.tincanapi.com/)
• Who can contribute?
• Fragmentation
• How to merge data that uses different verbs? (see the sketch after this slide)
• An IRI uniquely identifies a verb, but different IRIs might denote the same activity
• http://activitystrea.ms/schema/1.0/watch vs. http://myxapischema.lmu.de/watch?
• inadequate descriptions of verbs
• no tracking of changes
• …

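One pragmatic answer to the merging problem is to normalize verb IRIs to canonical ones before analysis. A minimal sketch; the equivalence table is an assumption that every project would have to curate itself.

# Map semantically equivalent verb IRIs onto one canonical IRI.
CANONICAL_VERBS = {
    # hypothetical local IRI -> registry IRI
    "http://myxapischema.lmu.de/watch":
        "http://activitystrea.ms/schema/1.0/watch",
    # ... further project-specific equivalences
}

def normalize_verb(statement):
    """Rewrite the statement's verb id to its canonical form, if known."""
    verb = statement["verb"]["id"]
    statement["verb"]["id"] = CANONICAL_VERBS.get(verb, verb)
    return statement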
Learning Context Data Model

User centered events with focus on the context


Particularly suitable for mobile events/learning
REST API for the management of events
• CRUD methods for manipulation of events
• Sharing events and data
LCDM API documentation
• http://www.learning-context.de/text/19/APIDoc
• http://docs.learning-context.de/

Last updated in 2015

Contextualized Attention Metadata (CAM)

Data about users’ foci of attention and activities


Generation of context specific user profile
Annotate data objects
• Information about users and usage

CAM Description
• https://www.fit.fraunhofer.de/content/dam/fit/de/documents/02_Contextualized-Attention-Metadata.pdf

IMS Caliper: Structure

Three main components


• IMS Caliper event
– Relationship between actor and object
• IMS Caliper entity
– Participation objects
• IMS Caliper metric profiles
– For each activity domain

IMS Caliper complete specification


• https://www.imsglobal.org/sites/default/files/caliper/v1p1/caliper-spec-v1p1/caliper-spec-v1p1.html
• https://www.imsglobal.org/caliper-11-metric-profiles

IMS Caliper: Event structure

Actor, action, object

[Figure: example of a view event]

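For comparison with the xAPI statement shown earlier, here is a minimal sketch of a Caliper ViewEvent, written as a Python dict mirroring the JSON-LD serialization. The field names follow the Caliper v1.1 specification; all identifiers are illustrative placeholders.

import uuid

# Minimal ViewEvent: an actor (Person) performs an action (Viewed) on an
# object (a digital resource). Identifiers are placeholders.
view_event = {
    "@context": "http://purl.imsglobal.org/ctx/caliper/v1p1",
    "id": f"urn:uuid:{uuid.uuid4()}",
    "type": "ViewEvent",
    "actor": {"id": "https://example.edu/users/554433", "type": "Person"},
    "action": "Viewed",
    "object": {"id": "https://example.edu/etexts/201", "type": "Document"},
    "eventTime": "2021-01-04T08:45:00.000Z",
}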
IMS Caliper: Events

• AnnotationEvent
• AssessmentEvent
• AssessmentItemEvent
• AssignableEvent
• ForumEvent
• GradeEvent
• MediaEvent
• MessageEvent
• NavigationEvent
• SessionEvent
• ThreadEvent
• ToolUseEvent
• ViewEvent

IMS Caliper: Entities

[Figure: overview of the IMS Caliper entities]
IMS Caliper: Metric Profiles

Vocabulary restrictions for each profile (illustrated by the sketch after this slide)


• supported events
• supported actors
• supported actions
• supported objects
• supported generated entities
• supported target entities
• other requirements

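A minimal sketch of how a metric profile acts as a vocabulary restriction; the profile contents below are illustrative, not the normative tables from the specification linked above.

# Illustrative media profile: the event types and actions it supports.
MEDIA_PROFILE = {
    "events": {"MediaEvent", "ViewEvent"},
    "actions": {"Started", "Paused", "Resumed", "Ended", "Viewed"},
}

def conforms(event, profile=MEDIA_PROFILE):
    """Accept only events whose type and action the profile supports."""
    return (event["type"] in profile["events"]
            and event["action"] in profile["actions"])

print(conforms({"type": "ViewEvent", "action": "Viewed"}))    # True
print(conforms({"type": "GradeEvent", "action": "Graded"}))   # False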
How to choose a suitable data model (data standard)?

Depends on
• your learning scenarios
• your usage scenarios
• your analytics scenarios

• the nature of the data


• the available/used learning data providers
• your experiences/expertise

Privacy Considerations &
Trusted Learning Analytics
Privacy

• Delicate issue
• Privacy in the Internet era

• Government laws
• GDPR

• Practical guidelines

Dilemmas

• Environment
• Contextual integrity
• Appropriateness
• Role of power
• Impact of surveillance
• Transparency
• Data Ownership
• Students’ Privacy

Environment: Privacy on the internet

• Easy to collect personal information


• Every interaction triggers log data
• Vast possibilities to collect personal information

• Privacy as currency
• Give something to get something

What do the big players collect?

Google:
• Device information
• Log information
• Location information
• Local storage
• Cookies and anonymous identifiers
(https://policies.google.com/privacy)

Microsoft Windows 10:
• Diagnostic data
(https://docs.microsoft.com/windows/privacy/configure-windows-diagnostic-data-in-your-organization)

Context

• Appropriateness
• What is collected?
• What is revealed?
• Validity across contexts

• Distribution
• Who sees the data (in terms of roles)
• When?
• For what purpose?

Sharing of data in Learning Analytics systems
(Ifenthaler & Schumacher, 2016)

N = 330 valid responses (223 female, 107 male)

[Figure: bar chart of the share of students willing (yes/no) to share various types of data with a learning analytics system]

Students are open to sharing more data if the learning analytics system provides rich and meaningful information.
GDPR key principles (Article 5)

• Lawfulness, fairness and transparency


• Personal data is processed lawfully, fairly and transparently
• Purpose limitation
• Personal data is collected only for specified purposes (in this case LA)
• Data minimization (see the sketch after this slide)
• The collected data is limited to what is necessary for those purposes
• Accuracy
• Avoid misinterpretation of data and analytics results
• Storage limitation
• Data is held for no longer than is necessary
• Integrity and confidentiality (safety & security)
• The data is retained securely
• Accountability
• Responsible institutional body for learning analytics (and other e-learning
services)
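A minimal sketch of how data minimization and storage limitation could be enforced on collected events; the field whitelist and retention period are policy assumptions that the responsible institutional body would have to define.

from datetime import datetime, timedelta, timezone

ALLOWED_FIELDS = {"timestamp", "verb", "object"}   # minimization policy
RETENTION = timedelta(days=365)                    # storage-limitation policy

def minimize(event):
    """Keep only the fields necessary for the stated LA purpose."""
    return {k: v for k, v in event.items() if k in ALLOWED_FIELDS}

def purge_expired(events, now=None):
    """Drop events that have been held longer than the retention period."""
    now = now or datetime.now(timezone.utc)
    return [e for e in events
            if now - datetime.fromisoformat(e["timestamp"]) < RETENTION]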
Ethical Principles (Slade & Prinsloo, 2013)

• Learning Analytics as moral practice


• Students at the center
• Data is temporal and dynamic
• Student success is complex and multidimensional
• Transparency
• Institutions obliged to use data

• Research projects/Outcomes concerning ethics and privacy


• DELICATE (Drachsler & Greller, 2016)
• Code Of Practice
• The Open University: http://www.open.ac.uk/students/charter/essential-documents/ethical-use-student-data-learning-analytics-policy
• The University of Edinburgh: http://www.ed.ac.uk/information-services/learning-technology/learning-analytics
• Jisc: https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics
• Verhaltenskodex für Trusted Learning Analytics. Entwurf für die hessischen
Hochschulen: http://nbn-resolving.de/urn:nbn:de:0111-dipfdocs-189038

DELICATE checklist (Drachsler & Greller, 2016)

Determination – Why do you want to apply LA?
Explain – Be open about your intentions and objectives
Legitimate – Why are you allowed to have the data?
Involve – Involve all stakeholders and the data subjects
Consent – Make a contract with the data subjects
Anonymize – Make the individual not retrievable
Technical – Procedures to guarantee privacy
External – If you work with external providers

Practical Principles

• Transparency
• Everything must be consent-based (usage does not imply consent)
• IP addresses are also considered to be personal information

• Learners as data owners

• Data governing policies


• Users have the right to be forgotten

• Accountability and assessment


• You can only collect what is adequate, necessary, and not excessive in
relation to Learning Analytics

Challenges: Identity Management & Privacy
(Röpke et al, 2017)

• Learners use different identities


• How to map identities and data of a single learner?
• How to hide identities from the LRS and other applications? (see the sketch after this slide)

[Figure: a pseudonymity provider mediates between the LMS and apps on one side and the Learning Record Store on the other; usage analysis and a learning diary consume data from the LRS]

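One conceivable realization of such a pseudonymity provider, sketched below, derives a stable pseudonym per learner and application with a keyed hash. This is an illustration under assumed design choices, not the specific approach of Röpke et al.: the same learner always receives the same pseudonym within one application, while neither the LRS nor the applications can recover the real identity or link users across applications without the provider's secret key.

import hmac
import hashlib

SECRET_KEY = b"institutional-secret"   # placeholder; held only by the provider

def pseudonym(real_identity: str, application: str) -> str:
    """Derive a stable, application-specific pseudonym for a learner."""
    message = f"{application}:{real_identity}".encode()
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

# Same learner, different applications -> unlinkable identifiers.
print(pseudonym("learner@example.org", "lrs"))             # placeholder identity
print(pseudonym("learner@example.org", "learning-diary"))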
Pseudonym Registration and consent-based tracking

[Figure: a learner first registers a pseudonym with the pseudonymity provider; the LMS then performs consent-based data collection into the LRS under that pseudonym]
Wrap up
Recap
Recap I

1. Introduction E-Assessment
• What is (E-)Assessment? Why E-Assessment?
• Characteristics of e-assessments

Recap II

2. Formative Assessment
• Audience Response
• Feedback
• Categories by Narciss
• Timing
• …
• Peer Feedback/Assessment
3. Summative Assessment
• Types of E-Exams
• Infrastructures
• Workflow
• BYOD

Recap III

4. Item Types & Automatic Grading


• Item Type Characteristics
• Item Types
• Approaches for evaluation & feedback generation
– Strategies
– Integration of external systems
– Security

5. Item Design & Automatic Item Generation


• Bloom's taxonomy
• Design of MC items
• Automatic Item Generation
• Model-based approaches
• Generic process
• Using external data sources

Recap IV

6. Quality Assurance, Adaptive Assessment, Standards & Architectures


• Quality assurance
– Item difficulty: $p = \frac{\#\text{ correctly answered}}{\#\text{ students}}$
– Point-biserial discrimination: $PBS = \frac{\bar{X}_C - \bar{X}_R}{sd_{total}} \cdot \sqrt{\frac{p}{1-p}}$
• Adaptive assessment
• Standards for E-Assessment
– Question Format
– Tool Interconnection
• Architectures
– Components
– Patterns

7. Introduction to Learning Analytics

Recap V

8. Dashboards & Visualization (incl. their Evaluation)

9. Learner Modelling & Recommender Systems (incl. their evaluation)


• Overlay
• Stereotypes
• Perturbation
• Cognitive Theories
• Fuzzy & Bayesian learner models

Recap VI

10. Clustering, Classification & Prediction


(incl. their Evaluation)
• Data Mining
• Clustering (Hierarchical Approach, k-means)
• Classification & Prediction (decision trees,
k-nearest neighbors)

11. Social Network & Discourse Analysis


• Inherently social analytics
– Social Network Analysis
- Metrics (e.g., centrality)
- Community identification
– Discourse Analysis (DA)
- Analysis of meta-data (e.g., thread structures)
- Analysis of content/text

Recap VII

12. Standards & Privacy


• Learning Data, data warehouse(s)
• Standards (xAPI, IMS Caliper, …)
• Privacy (Ethics, GDPR,
Trusted Learning Analytics)

Prof. Dr. Sven Strickroth
Ludwig-Maximilians-Universität München
Institut für Informatik
Lehr- und Forschungseinheit für
Programmier- und Modellierungssprachen
Oettingenstraße 67
80538 München

Phone: +49-89-2180-9300
[email protected]

References & Further Reading

• Ifenthaler, D., & Schumacher, C. (2016). Student perceptions of privacy principles for learning analytics. Educational Technology Research and Development, 64(5), 923-938.
• Verhaltenskodex für Trusted Learning Analytics. Version 1.0. Entwurf für die hessischen Hochschulen: http://nbn-resolving.de/urn:nbn:de:0111-dipfdocs-189038
• Drachsler, H., & Greller, W. (2016). Privacy and analytics: it's a DELICATE issue. A checklist for trusted learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 89-98). ACM.
• Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510-1529.
• Röpke et al. (2017). Verwendung pseudonymer Identitäten zur Unterstützung von Learning Analytics in offenen Lernumgebungen. In DeLFI and GMW Workshops 2017. http://ceur-ws.org/Vol-2092/paper1.pdf
