- 2025/07/16: We provide two handy scripts.
- 2025/06/20: All the code and models are completed.
- 2025/04/19: We collect existing meta-learning methods for driver saliency prediction.
- 2025/02/15: We collect driving-accident data with category labels, divided into 4 folds by category.
  - DADA: 52 categories, repartitioned into DADA-52i.
  - PSAD: 4 categories, repartitioned into PSAD-4i.
- 2025/01/11: We propose a model to learn to learn driver attention in driving accident scenarios.
⚡Proposed Model 🔁
We propose MetaDriver, a framework that combines a one-shot learning strategy with scene-level semantic alignment, enabling the model to achieve human-like attentional perception in previously unseen scenarios.
- A Novel One-Shot Learning Framework for Driver Saliency Prediction. We propose MetaDriver, inspired by how drivers generalize from limited examples. It filters salient regions from support masks to guide attention learning and uses semantic alignment to improve cross-scene generalization.
- Two Modules for Noisy Labels and Limited Semantic Overlap. To address noisy attention labels and limited semantic overlap, SMF removes irrelevant background while SSA aligns salient features between support and query pairs, significantly boosting our model's performance.
- Extensive Evaluation on DADA-52i and PSAD-4i with Strong Generalization and SOTA Results. It exceeds 10 baselines on both datasets and across different backbones, achieving up to +34.3% CC, +45.2% SIM, and -31.2% KLD gains under the oracle-support setting. These results highlight its robustness and backbone-agnostic design.
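The CC, SIM, and KLD metrics reported above are standard saliency-evaluation measures. Below is a minimal pure-Python sketch of their usual definitions; the repository's own metric code may differ in normalization details.

```python
# Sketch of the three saliency metrics: CC (Pearson correlation),
# SIM (histogram intersection), KLD (KL divergence). Maps are flattened lists.
import math

def _normalize_sum(m):
    """Scale a map so its values sum to 1 (treat it as a distribution)."""
    s = sum(m)
    return [v / s for v in m]

def cc(pred, gt):
    """Pearson correlation coefficient between two saliency maps."""
    n = len(pred)
    mp, mg = sum(pred) / n, sum(gt) / n
    cov = sum((p - mp) * (g - mg) for p, g in zip(pred, gt))
    vp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    vg = math.sqrt(sum((g - mg) ** 2 for g in gt))
    return cov / (vp * vg)

def sim(pred, gt):
    """Similarity: sum of element-wise minima of the normalized maps."""
    return sum(min(p, g) for p, g in zip(_normalize_sum(pred), _normalize_sum(gt)))

def kld(pred, gt, eps=1e-7):
    """KL divergence of the predicted distribution from the ground truth."""
    p, g = _normalize_sum(pred), _normalize_sum(gt)
    return sum(gi * math.log(gi / (pi + eps) + eps) for gi, pi in zip(g, p))
```

Higher CC/SIM and lower KLD indicate better agreement with the ground-truth attention map.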
📖Datasets 🔁
| Dataset | Accident | Fold-0 | Fold-1 | Fold-2 | Fold-3 |
|---|---|---|---|---|---|
| DADA-52i | 52 | [1] a pedestrian crosses the road. ... | [14] there is an object crash. ... | [27] a motorbike is out of control. ... | [40] vehicle changes lane with the same direction to ego-car. ... |
| PSAD-4i | 4 | moving ahead or waiting. | lateral. | oncoming. | turning. |
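The fold layout in the table suggests a PASCAL-5i-style partition: each fold holds out one contiguous block of categories as unseen test classes (e.g. for DADA-52i, fold-0 starts at category 1, fold-1 at 14, fold-2 at 27, fold-3 at 40). A minimal sketch, assuming exactly this contiguous split:

```python
# Hypothetical fold partition matching the table: for DADA-52i (52 classes,
# 4 folds), fold i holds out categories 13*i+1 .. 13*(i+1) for testing and
# trains on the remaining 39. The actual repository split may differ.
def fold_split(num_classes, num_folds, fold):
    """Return (train_classes, test_classes) for the given fold (1-indexed ids)."""
    per_fold = num_classes // num_folds
    test = list(range(fold * per_fold + 1, (fold + 1) * per_fold + 1))
    train = [c for c in range(1, num_classes + 1) if c not in test]
    return train, test
```

Under this scheme, `fold_split(52, 4, 1)` holds out categories 14-26, consistent with fold-1 of the table starting at category [14].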
[Note] For each dataset we provide both our download link and the official link. Please choose according to your needs.
- DADA-52i: we will upload this dataset to BaiduYun (please wait). Official website: link.
- PSAD-4i: we will upload this dataset to BaiduYun (please wait). Official website: link.
DADA-52i directory structure:

```
MetaDADA/
├── 1/
│   ├── 001/
│   │   ├── images/
│   │   └── maps/
│   ├── 002/
│   └── ...
├── 2/
│   ├── 001/
│   │   ├── images/
│   │   └── maps/
│   ├── 002/
│   └── ...
└── ...
```

PSAD-4i directory structure:

```
MetaPSAD/
├── images/
│   ├── 2/
│   │   ├── 0004/
│   │   └── ...
│   ├── 3/
│   └── ...
└── maps/
    ├── 2/
    │   ├── 0004/
    │   └── ...
    ├── 3/
    └── ...
```
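Given the category/sequence layout above, a one-shot episode pairs a support sequence with a query sequence from the same accident category. A minimal sketch of such episode sampling (the directory walk and sampling policy here are assumptions, not the repository's actual data loader):

```python
# Hypothetical one-shot episode sampler for the MetaDADA/ layout:
# category/sequence/{images,maps}. Picks one category, then two distinct
# sequences: one as support, one as query.
import random
from pathlib import Path

def sample_episode(root, rng=random):
    """Return support/query (images, maps) directory pairs from one category."""
    root = Path(root)
    category = rng.choice([d for d in root.iterdir() if d.is_dir()])
    # rng.sample guarantees two distinct sequences.
    support, query = rng.sample([d for d in category.iterdir() if d.is_dir()], 2)
    return {
        "category": category.name,
        "support": (support / "images", support / "maps"),
        "query": (query / "images", query / "maps"),
    }
```

For PSAD-4i the images/ and maps/ roots are split at the top level, so the walk would be inverted accordingly.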
🛠️Deployment 🔁
- Git clone this repository

```shell
git clone https://github.com/zhao-chunyu/MetaDriver.git
cd MetaDriver
```

- Create conda environment

```shell
conda create -n MetaDriver python=3.11
conda activate MetaDriver
```

- Install PyTorch 2.5.1+cu121

```shell
pip install torch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 --index-url https://download.pytorch.org/whl/cu121
```

- Install requirements

```shell
pip install -r utils/requirements.txt
```

Before training, scripts you might use:
- modify_yaml.py: one-click modification of the dataset path. Details
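A minimal sketch of what a "one-click dataset path" helper like modify_yaml.py might do: rewrite a top-level `data_root:` entry in a YAML config in place. The key name and config layout are assumptions; see the repository script for the real logic.

```python
# Hypothetical helper: point a YAML config's `data_root:` entry at a new
# dataset location, preserving the rest of the file and the line's indentation.
import re
from pathlib import Path

def set_data_root(config_path, new_root):
    """Rewrite the value of every `data_root:` line in the config file."""
    text = Path(config_path).read_text()
    updated = re.sub(r"^(\s*data_root:\s*).*$", rf"\g<1>{new_root}",
                     text, flags=re.M)
    Path(config_path).write_text(updated)
```

A plain regex rewrite (rather than a YAML round-trip) keeps comments and key order intact, which matters when the config is hand-edited.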
If you wish to train with our model, please use the command below.

```shell
sh scripts/train_MetaDriver.sh [dataset] [split#] [backbone] [gpu]
```

- dataset: metadada, metapsad.
- #: 0, 1, 2, 3.
- backbone: resnet50, vgg.
- gpu: 0 or other.

Example:

```shell
sh scripts/train_MetaDriver.sh metadada split0 resnet50 0
```

For testing, we first generate the predicted saliency maps and then evaluate them with Python.
```shell
sh scripts/test_MetaDriver.sh [dataset] [split#] [backbone] [gpu]
```

Example:

```shell
sh scripts/test_MetaDriver.sh metadada split0 resnet50 0
```

After testing, scripts you might use:
- collect_metrics.py: one-click collection of metrics data. Details
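A minimal sketch of metric collection in the spirit of collect_metrics.py: scan per-fold test logs for CC/SIM/KLD values and average them. The log-line format assumed here (`CC: 0.512 SIM: 0.431 KLD: 1.203`) is an illustration, not the repository's actual output format.

```python
# Hypothetical metric collector: pull every "CC: x SIM: y KLD: z" line out of
# the given log texts and average each metric across all matches.
import re
from statistics import mean

LINE = re.compile(r"CC:\s*([\d.]+)\s+SIM:\s*([\d.]+)\s+KLD:\s*([\d.]+)")

def collect_metrics(log_texts):
    """Return the mean CC/SIM/KLD over all matching lines in all logs."""
    rows = [tuple(map(float, m.groups()))
            for text in log_texts for m in LINE.finditer(text)]
    cc, sim, kld = (mean(col) for col in zip(*rows))
    return {"CC": cc, "SIM": sim, "KLD": kld}
```

Averaging across the four fold logs gives the cross-validation numbers reported for DADA-52i and PSAD-4i.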
If you want to directly visualize all the data of a certain dataset, you can use the following command.

```shell
sh scripts/visual_MetaDriver.sh [dataset] [split#] [backbone] [gpu]
```

Example:

```shell
sh scripts/visual_MetaDriver.sh metadada split0 resnet50 0
```

If you want to run other comparison models, we support 5 SOTA models. Details
Supported models:

| Model | PFENet (TPAMI'22) | BAM (CVPR'22) | HDMNet (CVPR'23) | AMNet (NeurIPS'23) | AENet (ECCV'24) |
|---|---|---|---|---|---|
Then, you can run the following command.

```shell
sh scripts/[train_*.sh] [dataset] [split#] [backbone] [gpu]
```

- *: PFENet, BAM, HDMNet, AMNet, AENet.
- dataset: metadada, metapsad.
- #: 0, 1, 2, 3.
- backbone: resnet50, vgg.
- gpu: 0 or other.

Example:

```shell
sh scripts/train_PFENet.sh metadada split0 resnet50 0
```

Model result: testing and result visualization are the same as for our MetaDriver model.
🚀Demo 🔁
- Compare with SOTA Models
- Case of Unseen Accident Scenario
⭐️Cite 🔁
If you find this repository useful, please use the following BibTeX entry for citation and give us a star ⭐.

The BibTeX entry will be provided once the paper is accepted.