Implementation of Spectral Attention Networks, a powerful GNN that leverages key principles from spectral graph theory to enable full graph attention.
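The "spectral" part refers to positional encodings built from the eigendecomposition of the graph Laplacian, which are precomputed before training. Below is a minimal sketch of that kind of precomputation, assuming SciPy and the symmetric normalized Laplacian; the function name `laplacian_positional_encoding` is illustrative and not the repository's actual API (the real precomputation lives in the `data` directory described below).

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def laplacian_positional_encoding(adj, k):
    """Return the k smallest non-trivial eigenpairs of L = I - D^{-1/2} A D^{-1/2}.

    adj : scipy.sparse matrix (n x n), assumed undirected/symmetric.
    k   : number of eigenvectors to keep as positional features.
    """
    n = adj.shape[0]
    deg = np.asarray(adj.sum(axis=1)).flatten()
    with np.errstate(divide="ignore"):
        d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    d_inv_sqrt = sp.diags(d_inv_sqrt)
    lap = sp.eye(n) - d_inv_sqrt @ adj @ d_inv_sqrt
    # Take k+1 smallest eigenvalues and drop the trivial (~0) mode.
    eigvals, eigvecs = eigsh(lap, k=k + 1, which="SM")
    return eigvals[1:], eigvecs[:, 1:]
```

The returned eigenvalues and eigenvectors are typically attached to each graph as node-level positional features before the attention model is trained.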
The repository is organized as follows:

- `nets` contains the Node, Edge, and no-LPE architectures, implemented with PyTorch.
- `layers` contains the multi-headed attention employed by the Main Graph Transformer, implemented in DGL (a simplified, dense sketch follows this list).
- `train` contains methods to train the models.
- `data` contains the dataset classes and various methods used in precomputation.
- `configs` contains the parameters used in the ablation and SOTA comparison studies.
- `misc` contains scripts from https://github.com/graphdeeplearning/graphtransformer to download datasets and set up environments.
- `scripts` contains scripts to reproduce the ablation and SOTA comparison results. See `scripts/reproduce.md` for details.
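For intuition on what full graph attention means here, the following is a simplified, dense PyTorch sketch in which every node attends to every other node, with a learned per-head bias distinguishing real edges from non-edges. The class name, the dense adjacency input, and the scalar edge/non-edge bias are illustrative assumptions; the repository's actual layer is the sparse DGL implementation in `layers`.

```python
import torch
import torch.nn as nn

class DenseGraphAttention(nn.Module):
    """Illustrative dense multi-head attention over all node pairs."""

    def __init__(self, dim, num_heads):
        super().__init__()
        assert dim % num_heads == 0
        self.h, self.d = num_heads, dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        # One scalar bias per head for non-edge (index 0) and edge (index 1) pairs.
        self.edge_bias = nn.Parameter(torch.zeros(num_heads, 2))

    def forward(self, x, adj):
        # x: (n, dim) node features; adj: (n, n) dense {0, 1} adjacency.
        n = x.size(0)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(n, self.h, self.d).transpose(0, 1)      # (h, n, d)
        k = k.view(n, self.h, self.d).transpose(0, 1)
        v = v.view(n, self.h, self.d).transpose(0, 1)
        scores = q @ k.transpose(-2, -1) / self.d ** 0.5   # (h, n, n)
        bias = self.edge_bias[:, adj.long()]               # (h, n, n)
        attn = torch.softmax(scores + bias, dim=-1)
        out = (attn @ v).transpose(0, 1).reshape(n, -1)    # (n, dim)
        return self.out(out)
```

For example, `DenseGraphAttention(dim=64, num_heads=8)` applied to an `(n, 64)` feature matrix and an `(n, n)` adjacency returns updated `(n, 64)` node features.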