
AAAI 2026: DcMatch: Unsupervised Multi-Shape Matching with Dual-Level Consistency


🔎 Abstract

Establishing point-to-point correspondences across multiple 3D shapes is a fundamental problem in computer vision and graphics. In this paper, we introduce DcMatch, a novel unsupervised learning framework for non-rigid multi-shape matching. Unlike existing methods that learn a canonical embedding from a single shape, our approach leverages a shape graph attention network to capture the underlying manifold structure of the entire shape collection. This enables the construction of a more expressive and robust shared latent space, leading to more consistent shape-to-universe correspondences via a universe predictor. Simultaneously, we represent these correspondences in both the spatial and spectral domains and enforce their alignment in the shared universe space through a novel cycle consistency loss. This dual-level consistency fosters more accurate and coherent mappings. Extensive experiments on several challenging benchmarks demonstrate that our method consistently outperforms previous state-of-the-art approaches across diverse multi-shape matching scenarios.
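The dual-level consistency idea above can be illustrated with a toy sketch. This is not the paper's actual loss, just a minimal NumPy illustration (all names here are hypothetical) of the cycle-consistency principle: composing shape-to-universe correspondences along a cycle x → y → z should agree with the direct map x → z.

```python
import numpy as np

def pairwise_from_universe(pi_x, pi_y):
    """Compose two shape-to-universe correspondence matrices
    (n_x x u) and (n_y x u) into a pairwise x->y map (n_x x n_y)."""
    return pi_x @ pi_y.T

def cycle_consistency_loss(pi_x, pi_y, pi_z):
    """Frobenius penalty comparing the composed cycle x->y->z with
    the direct map x->z, both routed through the universe space."""
    p_xy = pairwise_from_universe(pi_x, pi_y)
    p_yz = pairwise_from_universe(pi_y, pi_z)
    p_xz = pairwise_from_universe(pi_x, pi_z)
    return np.linalg.norm(p_xy @ p_yz - p_xz)

# Permutation-valued universe maps compose consistently, so the loss vanishes.
rng = np.random.default_rng(0)
u = 5
perms = [np.eye(u)[rng.permutation(u)] for _ in range(3)]
print(cycle_consistency_loss(*perms))  # -> 0.0
```

In the real framework the correspondences are soft (predicted by the universe predictor) and the consistency is enforced in both the spatial and spectral domains, but the composition structure is the same.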


⚙️ Installation

# git clone this repository
git clone https://github.com/YeTianwei/DcMatch.git
cd DcMatch

conda create -n DcMatch python=3.8
conda activate DcMatch

# install PyTorch
conda install -c pytorch -c nvidia pytorch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 pytorch-cuda=12.1 -y
# install PyG
pip install torch-geometric==2.6.1
pip install torch-scatter==2.1.2 torch-sparse==0.6.18 torch-cluster==1.6.3 torch-spline-conv==1.2.2 -f https://data.pyg.org/whl/torch-2.1.0+cu121.html
# install pytorch3d
# if installation fails, refer to https://anaconda.org/channels/pytorch3d/packages/pytorch3d/files
pip install pytorch3d==0.7.8
pip install -r requirements.txt
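After installation, it can help to verify that the key packages are importable before running any project code. A minimal sanity-check sketch (plain Python; not part of the repository):

```python
import importlib.util

def check_packages(names):
    """Return a dict mapping each package name to whether it is importable."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# Packages installed in the steps above.
status = check_packages(["torch", "torch_geometric", "pytorch3d"])
for name, ok in status.items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```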

In addition, this code uses Python bindings for an implementation of the Discrete Shell Energy.

Please follow the installation instructions from: Thin shell energy

🗂️ Dataset

For training and testing datasets used in this paper, please refer to the ULRSSM repository from Dongliang Cao et al. Please follow the instructions there to download the necessary datasets and place them under ../data/:

├── data
│   ├── FAUST_r
│   ├── FAUST_a
│   ├── SCAPE_r
│   ├── SCAPE_a
│   ├── SHREC19_r
│   ├── TOPKIDS
│   ├── SMAL_r
│   └── DT4D_r

We thank the original dataset providers for their contributions to the shape analysis community; all credit goes to the respective authors and contributors.

🔧 Data preparation

For data preprocessing, we provide preprocess.py to compute all required quantities. Here is an example for SMAL_r:

python preprocess.py --data_root ../data/SMAL_r/ --no_normalize --n_eig 200
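The exact outputs of preprocess.py are not documented here, but the --n_eig 200 flag suggests it precomputes the first 200 Laplacian eigenpairs per shape, as is standard in spectral shape matching. As a rough illustration only (the real script presumably operates on the mesh's Laplace-Beltrami operator, not a toy graph), the spectral decomposition step looks like:

```python
import numpy as np

# Toy adjacency of a 4-vertex path graph, standing in for mesh connectivity.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

L = np.diag(A.sum(axis=1)) - A    # combinatorial graph Laplacian
evals, evecs = np.linalg.eigh(L)  # ascending eigenvalues, orthonormal eigenbasis

print(np.round(evals, 4))  # the first eigenvalue is ~0 (constant eigenvector)
```

On a real mesh one would keep only the first n_eig eigenpairs, which serve as the truncated spectral basis for functional-map style correspondences.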

🔥 Train

To train a model on a specific dataset, run:

python train.py --opt options/train/smal.yaml

You can monitor the training process in TensorBoard or via Weights & Biases (wandb):

tensorboard --logdir experiments/

🚀 Test

To test a trained model on a specific dataset, run:

python test.py --opt options/test/smal.yaml

The qualitative and quantitative results will be saved in the results folder.

🎨 Visualization

Make sure to install the latest Polyscope build to enable headless rendering:

pip uninstall polyscope
pip install git+https://github.com/nmwsharp/polyscope-py.git@102c57f90d8aeb73b869d4dbf2f48f9466e08c00

To visualize the final results, run:

python visualize.py --opt options/test/smal.yaml

The visualized images will be saved in the results folder.

🧠 Pretrained models

You can find all pre-trained models in the checkpoints folder for reproducibility.

🙏 Acknowledgement

The framework implementation is adapted from Hybrid Functional Maps for Crease-Aware Non-Isometric Shape Matching.

The implementation of DiffusionNet is based on the official implementation.

We thank the authors for making their code publicly available.

📬 Contact

Feel free to send us an email ([email protected]) if you have any questions about the paper or find any bugs in the implementation.

🎓 Citations

Please cite our paper when using this code. You can use the following BibTeX entry:

@article{ye2025dcmatch,
  title={DcMatch: Unsupervised Multi-Shape Matching with Dual-Level Consistency},
  author={Ye, Tianwei and Ma, Yong and Mei, Xiaoguang},
  journal={arXiv preprint arXiv:2509.01204},
  year={2025}
}
