Official code release for PoissonNet: A Local-Global Approach for Learning on Surfaces, by Arman Maesumi, Tanish Makadia, Thibault Groueix, Vladimir G. Kim, Daniel Ritchie, and Noam Aigerman, published in ACM Transactions on Graphics (Proceedings of SIGGRAPH Asia 2025).
First install PyTorch; any modern version should work. Our code was validated on two configurations: pytorch==2.1.0 / python==3.9 / CUDA 11.8, and pytorch==2.8.0 / python==3.12 / CUDA 12.8.
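For example, a CUDA 12.8 build can be installed with pip (this mirrors the uv command further below; adjust the index URL to match your local CUDA version):

pip install torch torchvision --index-url https://download.pytorch.org/whl/cu128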
Then install the remaining dependencies:
pip install pyrender cholespy libigl einops scipy matplotlib tqdm trimesh pillow panopti
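As an optional sanity check that the dependencies resolve (igl and PIL are the standard import names for libigl and pillow), you can run:

python -c "import pyrender, cholespy, igl, einops, scipy, matplotlib, tqdm, trimesh, PIL, panopti"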
PoissonNet uses custom CUDA kernels for on-the-fly construction of mesh operators; you can install our kernels as shown below. Ensure PyTorch is already installed before this step:
git clone https://github.com/ArmanMaesumi/torch_mesh_ops
cd torch_mesh_ops
python setup.py install
Note: avoid copying the torch_mesh_ops source directly into the PoissonNet codebase, as it may cause import issues.
Note: to build torch_mesh_ops, the version of your locally installed CUDA Toolkit (the nvcc compiler) must match the CUDA version PyTorch was built with (see torch.version.cuda).
See https://github.com/ArmanMaesumi/torch_mesh_ops for more info.
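A quick way to compare the two CUDA versions before building:

nvcc --version
python -c "import torch; print(torch.version.cuda)"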
If you prefer uv as your package manager, consider installing like so (change the Torch CUDA version as needed):
uv venv -p 3.12 .venv
source .venv/bin/activate
uv pip install torch torchvision --index-url https://download.pytorch.org/whl/cu128
uv pip install pyrender libigl cholespy einops scipy matplotlib tqdm trimesh pillow panopti 'smplx[all]'
uv pip install --no-build-isolation git+https://github.com/ArmanMaesumi/torch_mesh_ops

We provide training scripts, hyperparameters, and pretrained weights for all of our relevant experiments under experiments/. For preparing datasets, please refer to the respective README.md files located in these experiment directories.
For example, once datasets are prepared, you may start training by running the following from the project root directory:
python -m experiments.<experiment_name>.trainer

Intermediate visualizations and checkpoints will be saved in the results/ directory.
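For instance, assuming the reposing experiment used in the demo below also exposes a trainer module (an assumption based on the naming pattern above), the command would be:

python -m experiments.reposing.trainer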
We provide an interactive viewer that lets you cycle through pose variations of a handful of characters. Interactive demos are created using Panopti, which works even in remote compute setups (e.g. through SSH). With Panopti installed, simply launch a server using:
python -m panopti.run_server --host localhost --port 8080

Then, from the project root, you can run our demo script in a separate terminal:
python -m experiments.reposing.test

The viewer will automatically load all obj meshes located in demo_meshes/, and allow you to cycle through several SMPL-X poses contained in demo_meshes/example_poses.npy as well as change meshes on the fly.
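If you want to inspect the demo poses beforehand, a minimal check (assuming example_poses.npy stores a plain array; a pickled object would additionally need allow_pickle=True) is:

python -c "import numpy as np; print(np.load('demo_meshes/example_poses.npy').shape)"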
If you find our work useful in your research, please consider citing:
@article{maesumi2025poissonnet,
author = {Maesumi, Arman and Makadia, Tanish and Groueix, Thibault and Kim, Vladimir G. and Ritchie, Daniel and Aigerman, Noam},
title = {PoissonNet: A Local-Global Approach for Learning on Surfaces},
year = {2025},
journal = {ACM Transactions on Graphics (Proceedings of SIGGRAPH Asia 2025)},
publisher = {Association for Computing Machinery}
}