
Attention (as Discrete-Time Markov) Chains


Project Page | Paper | Google Colab

This is the official implementation of Attention (as Discrete-Time Markov) Chains.

Usage

Core Functionality

For a straightforward implementation of multi-bounce attention, TokenRank, and lambda-weighting from the paper, see helpers.py.
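
As a rough orientation (not a substitute for helpers.py), the sketch below shows the two basic operations on an attention matrix `A` that is assumed to be row-stochastic (each row sums to 1 after softmax). The function names and the simple power-iteration stopping rule are illustrative choices: multi-bounce attention corresponds to taking k steps in the token Markov chain, and TokenRank is a stationary-distribution-style per-token importance score.

```python
import torch

def multi_bounce(A: torch.Tensor, k: int) -> torch.Tensor:
    """k-step transition matrix A^k, treating the row-stochastic
    attention matrix A as a Markov chain over tokens."""
    return torch.linalg.matrix_power(A, k)

def token_rank(A: torch.Tensor, n_iters: int = 100, tol: float = 1e-8) -> torch.Tensor:
    """Per-token importance as the chain's stationary distribution,
    computed by power iteration (PageRank-style)."""
    n = A.shape[-1]
    pi = torch.full((n,), 1.0 / n, dtype=A.dtype, device=A.device)
    for _ in range(n_iters):
        nxt = pi @ A            # one "bounce": row vector times transition matrix
        nxt = nxt / nxt.sum()   # renormalize for numerical stability
        if torch.norm(nxt - pi, p=1) < tol:
            break
        pi = nxt
    return pi
```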

Demos

We provide demos for DINOv1/2, CLIP, and a supervised ViT (from the transformers library) in demo.ipynb.
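
For a sense of what such a demo involves, here is a hedged end-to-end sketch using a supervised ViT from transformers. The checkpoint name, head-averaging, and the reuse of the `token_rank` sketch above are illustrative assumptions, not necessarily what demo.ipynb does.

```python
import torch
from transformers import ViTImageProcessor, ViTModel
from PIL import Image

processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224")
model = ViTModel.from_pretrained(
    "google/vit-base-patch16-224",
    attn_implementation="eager",  # eager attention so attention weights are returned
)

image = Image.open("example.jpg")  # any RGB image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs, output_attentions=True)

# outputs.attentions: one (batch, heads, tokens, tokens) tensor per layer
A = outputs.attentions[-1][0].mean(dim=0)   # average heads of the last layer
A = A / A.sum(dim=-1, keepdim=True)         # keep rows stochastic after averaging

scores = token_rank(A)                      # token_rank from the sketch above (or helpers.py)
patch_scores = scores[1:]                   # drop the CLS token, leaving the 14x14 patch grid
print(patch_scores.reshape(14, 14))
```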

FLUX Visualization

For visualizing attention with FLUX, run:

python flux.py flux.yml

You can edit flux.yml to tinker with the results.

Note: the libraries imported by flux.py must be installed in your virtual environment.
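
flux.py is the reference for FLUX visualization; purely as an illustration of the final step, the sketch below overlays a per-token score map (e.g. a TokenRank vector over the latent token grid) on a generated image. The file paths and the square-grid assumption are hypothetical, and this is not flux.py's actual logic.

```python
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

image = Image.open("flux_output.png").convert("RGB")  # placeholder path to a generated image
scores = np.load("token_scores.npy")                  # placeholder: one score per image token
grid = int(np.sqrt(scores.size))                      # assume a square latent token grid
heat = scores.reshape(grid, grid)

# Upsample the score grid to image resolution and blend it over the image.
heat_img = Image.fromarray((255 * heat / heat.max()).astype(np.uint8)).resize(image.size)

plt.imshow(image)
plt.imshow(np.array(heat_img), cmap="jet", alpha=0.5)
plt.axis("off")
plt.savefig("attention_overlay.png", bbox_inches="tight")
```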

Todos

  • Basic functionality
  • Visualization demo for FLUX
  • Segmentation demo for FLUX
  • Demo for DINOv1/2, ViT, CLIP
  • Reproduction of experiments

🎓 Citation

If you find our work useful, please consider giving a star ⭐ and a citation.

@article{erel2025attentionasdiscretetimemarkov,
      title = {Attention (as Discrete-Time Markov) Chains},
      author = {Erel, Yotam and D{\"u}nkel, Olaf and Dabral, Rishabh and Golyanik, Vladislav and Theobalt, Christian and Bermano, Amit H.},
      journal = {arXiv preprint arXiv:2507.17657},
      year = {2025}
}
