Test-Agnostic Long-Tailed Recognition

This repository is the official PyTorch implementation of Breaking Long-Tailed Learning Bottlenecks: A Controllable Paradigm with Hypernetwork-Generated Diverse Experts (NeurIPS 2024).

1. Requirements

  • To install requirements:
pip install -r requirements.txt
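  • Optionally, install into a fresh virtual environment first (a minimal sketch; any recent Python 3 should work, with the exact package versions pinned in requirements.txt):
# create and activate an isolated environment, then install the pinned dependencies
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt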

2. Datasets

(1) Four benchmark datasets

  • Please download these datasets and put them in the data/ directory.
  • ImageNet-LT and Places-LT can be found here.
  • iNaturalist data should be the 2018 version from here.
  • CIFAR-100 will be downloaded automatically with the dataloader.
data
├── ImageNet_LT
│   ├── test
│   ├── train
│   └── val
├── CIFAR100
│   └── cifar-100-python
├── Place365
│   ├── data_256
│   ├── test_256
│   └── val_256
└── iNaturalist 
    ├── test2018
    └── train_val2018
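After downloading, you can sanity-check the layout against the tree above (a minimal sketch; adjust the paths if your data lives elsewhere):
# each dataset folder should contain the split directories shown in the tree
ls data/ImageNet_LT    # expect: train/ val/ test/
ls data/Place365       # expect: data_256/ val_256/ test_256/
ls data/iNaturalist    # expect: train_val2018/ test2018/
ls data/CIFAR100       # cifar-100-python/ appears after the automatic download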

(2) Txt files

  • We provide txt files for test-agnostic long-tailed recognition for ImageNet-LT, Places-LT and iNaturalist 2018. CIFAR-100 will be generated automatically with the code.
  • For iNaturalist 2018, please unzip iNaturalist_train.zip (see the example command after the tree below).
data_txt
├── ImageNet_LT
│   ├── ImageNet_LT_backward2.txt
│   ├── ImageNet_LT_backward5.txt
│   ├── ImageNet_LT_backward10.txt
│   ├── ImageNet_LT_backward25.txt
│   ├── ImageNet_LT_backward50.txt
│   ├── ImageNet_LT_forward2.txt
│   ├── ImageNet_LT_forward5.txt
│   ├── ImageNet_LT_forward10.txt
│   ├── ImageNet_LT_forward25.txt
│   ├── ImageNet_LT_forward50.txt
│   ├── ImageNet_LT_test.txt
│   ├── ImageNet_LT_train.txt
│   ├── ImageNet_LT_uniform.txt
│   └── ImageNet_LT_val.txt
├── Places_LT_v2
│   ├── Places_LT_backward2.txt
│   ├── Places_LT_backward5.txt
│   ├── Places_LT_backward10.txt
│   ├── Places_LT_backward25.txt
│   ├── Places_LT_backward50.txt
│   ├── Places_LT_forward2.txt
│   ├── Places_LT_forward5.txt
│   ├── Places_LT_forward10.txt
│   ├── Places_LT_forward25.txt
│   ├── Places_LT_forward50.txt
│   ├── Places_LT_test.txt
│   ├── Places_LT_train.txt
│   ├── Places_LT_uniform.txt
│   └── Places_LT_val.txt
└── iNaturalist18
    ├── iNaturalist18_backward2.txt
    ├── iNaturalist18_backward3.txt
    ├── iNaturalist18_forward2.txt
    ├── iNaturalist18_forward3.txt
    ├── iNaturalist18_train.txt
    ├── iNaturalist18_uniform.txt
    └── iNaturalist18_val.txt 
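For example, assuming the archive sits alongside the other iNaturalist txt files (the exact location may differ in your checkout):
# extract the iNaturalist 2018 training list next to the other txt files
unzip data_txt/iNaturalist18/iNaturalist_train.zip -d data_txt/iNaturalist18/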

3. Scripts

(1) CIFAR100-LT

Training

  • To train the expertise-diverse model, run this command:
python train.py -c configs/mixup/standard_training/config_cifar100_ir100_bs-experts.json
  • To change the imbalance ratio from 100 to 10 or 50, use the corresponding config file; a sketch follows below.
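For instance, assuming the sibling config files follow the same ir<ratio> naming pattern (check configs/mixup/standard_training/ for the exact filenames):
# imbalance ratio 50 (hypothetical filename inferred from the ir100 pattern)
python train.py -c configs/mixup/standard_training/config_cifar100_ir50_bs-experts.json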

Evaluate

  • To evaluate the expertise-diverse model on the uniform test class distribution, run (see the note on checkpoint_path after this list):
python test.py -r checkpoint_path
  • To evaluate the expertise-diverse model on agnostic test class distributions, run:
python test_all_cifar.py -r checkpoint_path
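Here checkpoint_path is the path to a saved model file; with the underlying pytorch-template layout, runs are typically written under saved/models/, for example (an illustrative path, substitute your own run directory):
# evaluate a specific checkpoint; the placeholders are illustrative only
python test.py -r saved/models/<experiment_name>/<run_id>/model_best.pth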

Test-time training

  • To test-time train the expertise-diverse model for agnostic test class distributions, run:
python test_training_cifar.py -c configs/mixup/standard_training/config_cifar100_ir100_bs-experts.json -r checkpoint_path
  • As with training, change the imbalance ratio from 100 to 10 or 50 by selecting the corresponding config file.

(2) ImageNet-LT

Training

  • To train the expertise-diverse model, run this command:
python train.py -c configs/mixup/standard_training/config_imagenet_lt_resnet50_bs-experts.json

Evaluate

  • To evaluate the expertise-diverse model on the uniform test class distribution, run:
python test.py -r checkpoint_path
  • To evaluate the expertise-diverse model on agnostic test class distributions, run:
python test_all_imagenet.py -r checkpoint_path

Test-time training

  • To test-time train the expertise-diverse model for agnostic test class distributions, run:
python test_training_imagenet.py -c configs/mixup/standard_training/config_imagenet_lt_resnet50_bs-experts.json -r checkpoint_path

(3) Places-LT

Training

  • To train the expertise-diverse model, run this command:
python train.py -c configs/config_places_lt_resnet152_sade.json

Evaluate

  • To evaluate the expertise-diverse model on the uniform test class distribution, run:
python test_places.py -r checkpoint_path
  • To evaluate the expertise-diverse model on agnostic test class distributions, run:
python test_all_places.py -r checkpoint_path

Test-time training

  • To test-time train the expertise-diverse model for agnostic test class distributions, run:
python test_training_places.py -c configs/test_time_places_lt_resnet152_sade.json -r checkpoint_path

(4) iNaturalist 2018

Training

  • To train the expertise-diverse model, run this command:
python train.py -c configs/config_iNaturalist_resnet50_sade.json

Evaluate

  • To evaluate the expertise-diverse model on the uniform test class distribution, run:
python test.py -r checkpoint_path
  • To evaluate the expertise-diverse model on agnostic test class distributions, run:
python test_all_inat.py -r checkpoint_path

Test-time training

  • To test-time train the expertise-diverse model for agnostic test class distributions, run:
python test_training_inat.py -c configs/test_time_iNaturalist_resnet50_sade.json -r checkpoint_path

4. Citation

If you find our work inspiring or use our codebase in your research, please cite:

@article{zhao2024breaking,
  title={Breaking Long-Tailed Learning Bottlenecks: A Controllable Paradigm with Hypernetwork-Generated Diverse Experts},
  author={Zhao, Zhe and Wen, HaiBin and Wang, Zikang and Wang, Pengkun and Wang, Fanfu and Lai, Song and Zhang, Qingfu and Wang, Yang},
  journal={Advances in Neural Information Processing Systems},
  volume={37},
  pages={7493--7520},
  year={2024}
}

5. Acknowledgements

This project is based on this PyTorch template.

The multi-expert framework is based on RIDE. The generation of agnostic test class distributions takes reference from LADE.
