Elisabetta Fedele1,2,
Boyang Sun1,
Leonidas Guibas2,
Marc Pollefeys1,3,
Francis Engelmann2
1ETH Zurich,
2Stanford University,
3Microsoft
Code | Paper | Project Page
SuperDec represents arbitrary 3D scenes with a compact and modular set of superquadric primitives.
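Each superquadric primitive is defined by an implicit inside-outside function over its semi-axis lengths and two shape exponents. The sketch below evaluates that standard function; the parameter names (`scale`, `eps`) are illustrative and not the repository's API:

```python
import numpy as np

def superquadric_inside_outside(points, scale, eps):
    """Evaluate F(x) for a superquadric centered at the origin.
    F < 1 inside, F = 1 on the surface, F > 1 outside.
    points: (N, 3) array in the primitive's local frame
    scale:  (a1, a2, a3) semi-axis lengths
    eps:    (eps1, eps2) shape exponents
    """
    x, y, z = (np.abs(points) / np.asarray(scale, dtype=float)).T
    e1, e2 = eps
    return (x ** (2.0 / e2) + y ** (2.0 / e2)) ** (e2 / e1) + z ** (2.0 / e1)

# With eps1 = eps2 = 1 the superquadric is an ellipsoid; for unit scale,
# the point (1, 0, 0) lies exactly on the surface (F = 1).
F = superquadric_inside_outside(np.array([[1.0, 0.0, 0.0]]),
                                (1.0, 1.0, 1.0), (1.0, 1.0))
```

Varying the exponents morphs the primitive between box-like (small eps) and pinched/star-like (large eps) shapes, which is what makes a small set of primitives expressive enough to cover a scene.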
Clone the repository and set up the environment:
```bash
git clone https://github.com/elisabettafedele/superdec.git
cd superdec

# Create and activate virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt
pip install -e .
```
```bash
# Build sampler (required for training only)
python setup_sampler.py build_ext --inplace
```

Download the checkpoints:

```bash
bash scripts/download_checkpoints.sh
```

Alternatively, you can download the individual folders using the links below.
| Model | Dataset | Normalized | Link |
|---|---|---|---|
| shapenet | ShapeNet | ❌ | shapenet |
| normalized | ShapeNet | ✅ | normalized |
Note: We use the `shapenet` model checkpoint to evaluate on ShapeNet and the `normalized` model checkpoint to evaluate on objects from generic 3D scenes.
Once you have downloaded the checkpoints, you can run an inference example:

```bash
python demo_viser.py
```

Download the ShapeNet dataset (73.4 GB):

```bash
bash scripts/download_shapenet.sh
```

The dataset will be saved to `data/ShapeNet/`. After downloading ShapeNet and the checkpoints, the following project structure is expected:
```
superdec/
├── checkpoints/          # Checkpoints storage
│   ├── normalized/       # Checkpoint and config for normalized objects
│   └── shapenet/         # Checkpoint and config for ShapeNet objects
├── data/                 # Dataset storage
│   └── ShapeNet/         # ShapeNet dataset
├── examples/             # Inference example
│   └── chair.ply         # ShapeNet chair
├── scripts/              # Utility scripts
├── superdec/             # Main package
├── train/                # Training scripts
└── requirements.txt      # Dependencies
```
Generate and visualize results on the ShapeNet test set:

```bash
bash scripts/run_on_shapenet.sh
```

Note: Saving the .npz file and generating the meshes may take some time, depending on the size of the dataset and on the chosen superquadric resolution, respectively.
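The saved .npz file can be inspected with plain NumPy. A minimal sketch; the actual array names and file path depend on the repository's output format, so the helper below lists whatever is stored rather than assuming specific keys:

```python
import numpy as np

def summarize_npz(path):
    """Return {array_name: shape} for every array stored in a .npz file."""
    with np.load(path) as data:
        return {key: data[key].shape for key in data.files}

# Hypothetical usage; substitute the file written by run_on_shapenet.sh:
# print(summarize_npz("results/shapenet.npz"))
```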
If you want to retrain the network yourself, you can opt for either single- or multi-GPU training as follows.
Single GPU training:
```bash
python train/train.py "optimizer.lr=1e-4"
```

Multi-GPU training (4 GPUs):

```bash
torchrun --nproc_per_node=4 train/train.py
```

Note: Weights & Biases is disabled by default but you can activate it in the training config.
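The scene-level pipeline below expects per-object .ply point clouds. If you just want placeholder inputs to try it out, a minimal ASCII .ply file can be written by hand; this is a sketch of the file layout only (real inputs come from your instance segmentation, and the paths are placeholders):

```python
import numpy as np

def write_ascii_ply(path, points):
    """Write an (N, 3) point array as a minimal ASCII .ply point cloud."""
    pts = np.asarray(points, dtype=float)
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(pts)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in pts:
            f.write(f"{x} {y} {z}\n")

# Hypothetical usage; OBJECTS_SCENE_DIR and the filename are placeholders:
# write_ascii_ply("OBJECTS_SCENE_DIR/object_0.ply", np.random.rand(2048, 3))
```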
We assume you have the .ply files of all the segmented objects in a single folder `OBJECTS_SCENE_DIR`. Fill in the required fields in the script, following the instructions given there. Now you are ready to run inference:

```bash
bash scripts/run_on_scene.sh
```

This will create a .npz dataset of your objects, save the .npz inference file with the superquadric parameters, and visualize the results.

We use OMPL to demo path planning with SuperDec:

```bash
# Install the OMPL Python bindings
pip install ompl==1.7.0

# Run path planning in a given decomposed scene
python demo_planning.py
```

You can adjust the start and goal positions, as well as the collision radius, in the script.
Note: Running this demo requires a display interface.
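The collision test such a planner needs can be run headlessly. Below is an approximate sphere-vs-superquadric check built on the inside-outside function: it inflates each primitive's semi-axes by the collision radius, which is a common approximation rather than the exact check used in `demo_planning.py`, and it assumes axis-aligned primitives centered at the origin of their local frames (parameter names are illustrative):

```python
import numpy as np

def is_colliding(point, scales, exponents, radius):
    """Approximate check of a sphere of given radius against a list of
    superquadrics: inflate each primitive's semi-axes by the radius and
    evaluate the inside-outside function at the sphere center."""
    p = np.abs(np.asarray(point, dtype=float))
    for (a1, a2, a3), (e1, e2) in zip(scales, exponents):
        x, y, z = p / (np.array([a1, a2, a3]) + radius)
        f = (x ** (2.0 / e2) + y ** (2.0 / e2)) ** (e2 / e1) + z ** (2.0 / e1)
        if f <= 1.0:  # inside or on the inflated surface
            return True
    return False
```

A planner queries this predicate for every candidate state; states that return True are rejected, so the path stays at least roughly one collision radius away from every primitive.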
We adapted code from several awesome repositories, including superquadric_parsing, CuboidAbstractionViaSeg, volumentations, LION, occupancy_networks, and convolutional_occupancy_networks. Thanks for making your code and data publicly available.
We welcome contributions! Please feel free to submit issues, feature requests, or pull requests. For more specific questions or collaborations, please contact Elisabetta and Boyang.
- Core implementation and visualization
- ShapeNet training and evaluation
- Instance segmentation pipeline
- Path planning
- Grasping
- Superquadric-conditioned image generation