🚀Revisiting Applicable and Comprehensive Knowledge Tracing in Large-Scale Data (ECML-PKDD 2025)

PyTorch implementation of DKT2 (repository: zyy-2001/DKT2).


🌟Data and Data Preprocessing

Place the Assist17, EdNet, and Comp source files in the dataset directory, and process the data using the following commands respectively:

python preprocess_data.py --data_name assistments17
python preprocess_data.py --data_name ednet
python preprocess_data.py --data_name comp

You can also download the dataset from the link and unzip it in the current directory.

The statistics of the three datasets after processing are as follows:

| Datasets | #students | #questions | #concepts | #interactions |
|----------|-----------|------------|-----------|---------------|
| Assist17 | 1,708     | 3,162      | 411       | 934,638       |
| EdNet    | 20,000    | 12,215     | 1,781     | 2,709,132     |
| Comp     | 45,180    | 8,392      | 472       | 6,072,632     |
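As a quick sanity check on the table above, per-student interaction averages can be derived directly from these figures (a small illustrative script of ours, not part of the repository):

```python
# Average interactions per student, computed from the statistics table.
stats = {
    "Assist17": (1_708, 934_638),
    "EdNet": (20_000, 2_709_132),
    "Comp": (45_180, 6_072_632),
}

for name, (students, interactions) in stats.items():
    print(f"{name}: {interactions / students:.1f} interactions per student")
```

Assist17 is by far the densest dataset (~547 interactions per student), while EdNet and Comp average ~135 each.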

➡️Quick Start

Installation

Clone this repository and create the two conda environments:

# environment for DKT2 (xLSTM-based)
conda create -n dkt2 python=3.11
conda activate dkt2
pip install -r requirements.txt
# separate environment for the Mamba-based models
conda create -n mamba4kt python=3.11
conda activate mamba4kt
pip install -r requirements.txt

Note that xLSTM and Mamba require different CUDA versions, which is why two separate conda environments are needed. Please follow the installation instructions in the official xLSTM and Mamba GitHub repositories exactly; installing the matching CUDA packages is crucial.
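To confirm which CUDA build each environment ended up with, a quick check like the following can help (our own helper sketch; it assumes PyTorch was installed by the steps above and reports `None` if torch is missing or CPU-only):

```python
import importlib.util


def torch_cuda_build():
    """Return the CUDA version the installed torch was built against, or None."""
    if importlib.util.find_spec("torch") is None:
        return None  # torch is not installed in this environment
    import torch
    return torch.version.cuda  # None for CPU-only builds


if __name__ == "__main__":
    print(torch_cuda_build())
```

Run it once inside `dkt2` and once inside `mamba4kt` to verify the two environments really carry different CUDA builds.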

Training & Testing

You can run training and testing directly with the following commands:

  • One-step Prediction
CUDA_VISIBLE_DEVICES=0 python main.py --model_name dkt2 --data_name assistments17
CUDA_VISIBLE_DEVICES=0 python main.py --model_name akt --data_name assistments17 --trans True
  • Multi-step Prediction
CUDA_VISIBLE_DEVICES=0 python main.py --model_name dkt2 --data_name assistments17 --len 5
CUDA_VISIBLE_DEVICES=0 python main.py --model_name akt --data_name assistments17 --trans True --len 5
  • Varying-history-length Prediction
CUDA_VISIBLE_DEVICES=0 python main.py --model_name dkt2 --data_name assistments17 --seq_len 500
CUDA_VISIBLE_DEVICES=0 python main.py --model_name akt --data_name assistments17 --trans True --seq_len 500
  • Different Input Settings
CUDA_VISIBLE_DEVICES=0 python main.py --model_name akt --data_name assistments17 --len 5 --mask_future (△ setting)
CUDA_VISIBLE_DEVICES=0 python main.py --model_name akt --data_name assistments17 --len 5 --mask_response (◦ setting)
CUDA_VISIBLE_DEVICES=0 python main.py --model_name akt --data_name assistments17 --len 5 --pred_last (• setting)
  • Multi-concept Prediction
CUDA_VISIBLE_DEVICES=0 python main.py --model_name dkt2 --data_name assistments17 --joint True
CUDA_VISIBLE_DEVICES=0 python main.py --model_name akt --data_name assistments17 --trans True --joint True
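The three input settings (△, ◦, •) can be illustrated on a toy sequence. The sketch below is our own NumPy illustration of one plausible reading of the flags, not the repository's masking code: `mask_future` hides both questions and responses in the prediction window, `mask_response` hides only the responses, and `pred_last` keeps the full input but evaluates only the final steps.

```python
import numpy as np

# Toy history of 5 interactions; we predict the last n_pred = 2 steps.
questions = np.array([3, 7, 2, 9, 5])
responses = np.array([1, 0, 1, 1, 0])
n_pred = 2
MASK = -1  # placeholder id for a masked entry

# mask_future (triangle setting): neither questions nor responses are visible
q_future, r_future = questions.copy(), responses.copy()
q_future[-n_pred:] = MASK
r_future[-n_pred:] = MASK

# mask_response (circle setting): questions stay visible, responses are hidden
q_resp, r_resp = questions.copy(), responses.copy()
r_resp[-n_pred:] = MASK

# pred_last (bullet setting): full input, metrics computed only on the window
pred_mask = np.zeros_like(responses, dtype=bool)
pred_mask[-n_pred:] = True

print(q_future.tolist(), r_future.tolist())
print(q_resp.tolist(), r_resp.tolist())
print(pred_mask.tolist())
```

Please refer to the paper for the precise definition of each setting; this sketch only conveys the intuition that the settings differ in how much of the prediction window the model is allowed to see.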

⚠️Citation

If you find our work valuable, we would appreciate your citation:

@article{zhou2025revisiting,
  title={Revisiting Applicable and Comprehensive Knowledge Tracing in Large-Scale Data},
  author={Zhou, Yiyun and Han, Wenkang and Chen, Jingyuan},
  journal={arXiv preprint arXiv:2501.14256},
  year={2025}
}
