
Adapter Guidance Distillation (📝 Paper)

[Teaser images]

  • Adapter Guidance Distillation (AGD) folds classifier-free guidance (CFG) into the model using adapters, so each diffusion step needs only one forward pass instead of two, roughly doubling inference speed while in some cases slightly outperforming CFG (see the sketch after this list).
  • The adapters are lightweight, adding only 1–5% additional parameters, and the base weights stay frozen. Because the base model is untouched, the adapters can simply be disabled to recover the original CFG behavior.
  • The CFG of Stable Diffusion XL (2.6B parameters) can be distilled on a consumer GPU with 24 GB of VRAM.
  • AGD is composable with other methods, such as IP-Adapter and ControlNet.
  • No external training data is required: training uses only samples generated by the base model.
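
To make the difference concrete, here is a minimal sketch in PyTorch. The base_model and adapted_model callables are hypothetical stand-ins, not this repository's actual API: standard CFG needs a conditional and an unconditional forward pass per denoising step, while the AGD-adapted model produces the guided prediction in a single pass.

import torch

def cfg_step(base_model, x_t: torch.Tensor, t: torch.Tensor, cond, w: float) -> torch.Tensor:
    # Standard classifier-free guidance: two forward passes per step.
    eps_cond = base_model(x_t, t, cond)    # conditional prediction
    eps_uncond = base_model(x_t, t, None)  # unconditional prediction
    return eps_uncond + w * (eps_cond - eps_uncond)

def agd_step(adapted_model, x_t: torch.Tensor, t: torch.Tensor, cond) -> torch.Tensor:
    # AGD: the guided prediction has been distilled into the adapters,
    # so a single conditional forward pass suffices.
    return adapted_model(x_t, t, cond)

Because the base weights stay frozen, removing the adapters from adapted_model recovers base_model exactly.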

Usage

See the paper for details on the method and experiments. All commands have a --help flag to show the available options.
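
For example, to list the options of the Stable Diffusion training script:

python -m agd.sd.train --help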

[Figure: Overview of AGD components]

Sampling training trajectories

Diffusion Transformer:

python -m agd.dit.sample_trajectories --output-dir /path/to/trajectories

Stable Diffusion 2.1:

python -m agd.sd.sample_trajectories --output-dir /path/to/trajectories --base-model stabilityai/stable-diffusion-2-1 --prompt-file prompts/coco2017_train_subset.txt --inference-steps 999

Stable Diffusion XL:

python -m agd.sd.sample_trajectories --output-dir /path/to/trajectories --base-model stabilityai/stable-diffusion-xl-base-1.0 --prompt-file prompts/coco2017_train_subset.txt --inference-steps 1000

Training adapters

Diffusion Transformer:

python -m agd.dit.train --dir /path/to/results --data-path /path/to/trajectories <...options>

Stable Diffusion 2.1:

python -m agd.sd.train --dir /path/to/results --data-path /path/to/trajectories --base-model stabilityai/stable-diffusion-2-1 <...options>

Stable Diffusion XL:

python -m agd.sd.train --dir /path/to/results --data-path /path/to/trajectories --base-model stabilityai/stable-diffusion-xl-base-1.0 <...options>
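
Only the adapter parameters are updated during training; the base weights stay frozen, which is what keeps the parameter overhead at 1–5%. As a rough illustration only (the adapter architecture actually used by AGD may differ; see the paper), a generic bottleneck adapter looks like this:

import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    # Illustrative bottleneck adapter, not necessarily the AGD design.
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        # Zero-initialize the up-projection so the adapter starts as an
        # identity map and training begins from the base model's output.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(torch.relu(self.down(h)))

Because the adapter's output is a residual added to the frozen layer's activation, disabling it restores the base model's behavior.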

Sampling with adapters

Diffusion Transformer:

python -m agd.dit.sample --dir /path/to/results <...options>

Stable Diffusion 2.1/XL:

python -m agd.sd.sample --dir /path/to/results --prompt <prompt> <...options>
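
For example, with an arbitrary prompt:

python -m agd.sd.sample --dir /path/to/results --prompt "an astronaut riding a horse"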

Calculating metrics

Diffusion Transformer:

python -m agd.dit.calculate_metrics --dir /path/to/results --ref /path/to/ref_samples <...options>

Stable Diffusion 2.1/XL:

python -m agd.sd.calculate_metrics --dir /path/to/results --ref /path/to/ref_samples <...options>

Acknowledgement

This codebase builds upon the Diffusion Transformer repository and the diffusers library.

Citation

@article{jensen2025efficient,
  title={Efficient Distillation of Classifier-Free Guidance using Adapters},
  author={Jensen, Cristian Perez and Sadat, Seyedmorteza},
  journal={arXiv preprint arXiv:2503.07274},
  year={2025}
}

About

Official codebase for "Efficient Distillation of Classifier-Free Guidance using Adapters."
