
VSLAM-LAB

A Comprehensive Framework for Visual SLAM Baselines and Datasets

Alejandro Fontan · Tobias Fischer · Nicolas Marticorena

Somayeh Hussaini · Ted Vanderfeen · Javier Civera · Michael Milford

VSLAM-LAB is designed to simplify the development, evaluation, and application of Visual SLAM (VSLAM) systems. This framework enables users to compile and configure VSLAM systems, download and process datasets, and design, run, and evaluate experiments — all from a single command line!

Why Use VSLAM-LAB?

  • Unified Framework: Streamlines the management of VSLAM systems and datasets.
  • Ease of Use: Run experiments with minimal configuration and a single command.
  • Broad Compatibility: Supports a wide range of VSLAM systems and datasets.
  • Reproducible Results: Standardized methods for evaluating and analyzing results.

Getting Started

To ensure all dependencies are installed in a reproducible manner, we use the package management tool pixi. If you haven't installed pixi yet, please run the following command in your terminal:

curl -fsSL https://pixi.sh/install.sh | bash

After installation, restart your terminal or re-source your shell configuration for the changes to take effect. For more details, refer to the pixi documentation.

Clone the repository and navigate to the project directory:

git clone https://github.com/alejandrofontan/VSLAM-LAB.git && cd VSLAM-LAB

Quick Demo

You can now execute any baseline on any sequence from any dataset within VSLAM-LAB using the following command:

pixi run demo <baseline> <dataset> <sequence>

For a full list of available systems and datasets, see the VSLAM-LAB Supported Baselines and Datasets. Example commands:

pixi run demo mast3rslam eth table_3
pixi run demo droidslam euroc MH_01_easy
pixi run demo orbslam2 rgbdtum rgbd_dataset_freiburg1_xyz

To change the paths where the VSLAM-LAB-Benchmark and/or VSLAM-LAB-Evaluation data are stored (for example, to /media/${USER}/data), use the following commands:

pixi run set-benchmark-path /media/${USER}/data
pixi run set-evaluation-path /media/${USER}/data

Configure your own experiments

With VSLAM-LAB, you can easily design and configure experiments using a YAML file and run them with a single command. To run the example experiment, execute the following command:

ARGUMENT="--exp_yaml exp_mono.yaml" pixi run vslamlab

Experiments in VSLAM-LAB are defined as entries in a YAML file (see the example ~/VSLAM-LAB/configs/exp_mono.yaml):

exp_vslamlab:
  Config: config_mono.yaml     # YAML file listing the sequences to be run
  NumRuns: 1                   # Maximum number of executions per sequence
  Parameters: {verbose: 1}     # Parameters passed to the baseline executable
  Module: droidslam            # droidslam/monogs/orbslam2/mast3rslam/dpvo/...

Config files are YAML files containing the list of sequences to be executed in the experiment (see example ~/VSLAM-LAB/configs/config_mono.yaml):

rgbdtum:
  - 'rgbd_dataset_freiburg1_xyz'
hamlyn:
  - 'rectified01'
7scenes:
  - 'chess_seq-01'
eth:
  - 'table_3'
euroc:
  - 'MH_01_easy'
monotum:
  - 'sequence_01'
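For intuition, the sketch below expands such an experiment into its individual runs: every (dataset, sequence) pair in the config file is executed up to NumRuns times with the chosen module. This is an illustrative reading of the two YAML files above, not VSLAM-LAB's actual run loop:

import yaml

# Illustrative only: expand an experiment YAML into individual runs.
with open("configs/exp_mono.yaml") as f:
    experiments = yaml.safe_load(f)

for exp_name, exp in experiments.items():
    with open(f"configs/{exp['Config']}") as f:
        config = yaml.safe_load(f)  # maps dataset name -> list of sequences
    for dataset, sequences in config.items():
        for sequence in sequences:
            for run in range(exp["NumRuns"]):
                print(exp_name, exp["Module"], dataset, sequence, run)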

For a full list of available VSLAM systems and datasets, refer to the section VSLAM-LAB Supported Baselines and Datasets.

Add a new dataset

Datasets in VSLAM-LAB are stored in a folder named VSLAM-LAB-Benchmark, which is created by default in the same parent directory as VSLAM-LAB. If you want to modify the location of your datasets, change the variable VSLAMLAB_BENCHMARK in ~/VSLAM-LAB/utilities.py.

  1. Structure your dataset as follows:

~/VSLAM-LAB-Benchmark
└── YOUR_DATASET
    ├── sequence_01
    │   ├── rgb
    │   │   ├── img_01
    │   │   ├── img_02
    │   │   └── ...
    │   ├── calibration.yaml
    │   ├── rgb.txt
    │   └── groundtruth.txt
    ├── sequence_02
    │   └── ...
    └── ...
  2. Derive a new dataset class in dataset_{your_dataset}.py from ~/VSLAM-LAB/Datasets/Dataset_vslamlab.py, and create a corresponding YAML configuration file named dataset_{your_dataset}.yaml.

  3. Register your dataset in the function def get_dataset(...) in ~/VSLAM-LAB/Datasets/get_dataset.py:

from Datasets.dataset_{your_dataset} import {YOUR_DATASET}_dataset
...

def get_dataset(dataset_name, benchmark_path):
    ...
    switcher = {
        "rgbdtum": lambda: RGBDTUM_dataset(benchmark_path),
        ...
        "{your_dataset}": lambda: {YOUR_DATASET}_dataset(benchmark_path),
    }
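
For orientation, a new dataset class might look like the sketch below. Only the file layout and naming come from this README; the base-class name DatasetVSLAMLab, its constructor signature, the dataset_path attribute, and the download hook are assumptions, so check Dataset_vslamlab.py for the real interface:

# Datasets/dataset_your_dataset.py -- illustrative sketch only; the actual
# base-class interface in Dataset_vslamlab.py may differ.
import os
from Datasets.Dataset_vslamlab import DatasetVSLAMLab  # class name assumed

class YOUR_DATASET_dataset(DatasetVSLAMLab):
    def __init__(self, benchmark_path):
        # The label should match the key added to get_dataset() and the
        # dataset_your_dataset.yaml configuration file.
        super().__init__('your_dataset', benchmark_path)

    def download_sequence_data(self, sequence_name):
        # Hypothetical hook: verify the sequence follows the layout from
        # step 1 (rgb/, calibration.yaml, rgb.txt, groundtruth.txt).
        # self.dataset_path is an assumed base-class attribute.
        sequence_path = os.path.join(self.dataset_path, sequence_name)
        assert os.path.isdir(sequence_path), f"missing sequence: {sequence_path}"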

Add a new Baseline

To add a new baseline, two files need to be edited:

  1. pixi.toml: add an entry for your baseline with its header, dependencies, and tasks (such as git-clone, execute, and kill_all), following the existing entries as templates.

  2. Baselines/get_baseline.py: import and register your baseline:

# ADD your imports here
from Baselines.baseline_droidslam import DROIDSLAM_baseline
from Baselines.baseline_droidslam import DROIDSLAM_baseline_dev
...

def get_baseline(baseline_name):
    baseline_name = baseline_name.lower()
    switcher = {
        # ADD your baselines here
        "droidslam": lambda: DROIDSLAM_baseline(),
        "droidslam-dev": lambda: DROIDSLAM_baseline_dev(),
        ...
    }

    return switcher.get(baseline_name, lambda: "Invalid case")()
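
As a rough template, a new baseline class could mirror the registration pattern above. The base-class name and constructor used here are assumptions; use an existing file such as Baselines/baseline_droidslam.py as the authoritative reference:

# Baselines/baseline_your_system.py -- illustrative sketch; the real base
# class and its hooks live in the Baselines/ package and may differ.
from Baselines.BaselineVSLAMLab import BaselineVSLAMLab  # name assumed

class YOUR_SYSTEM_baseline(BaselineVSLAMLab):
    def __init__(self):
        # The label must match the key registered in get_baseline() and
        # the task names declared for this baseline in pixi.toml.
        super().__init__(baseline_name='yoursystem')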

License

VSLAM-LAB is released under the license in LICENSE.txt. For a list of code dependencies that are not the property of the authors of VSLAM-LAB, please check docs/Dependencies.md.

Citation

If you're using VSLAM-LAB in your research, please cite the paper below. If you're specifically using VSLAM systems or datasets that have been included, please cite those as well. For convenience, we provide a spreadsheet with citations for each dataset and VSLAM system.

@article{fontan2025vslam,
  title={VSLAM-LAB: A Comprehensive Framework for Visual SLAM Methods and Datasets},
  author={Fontan, Alejandro and Fischer, Tobias and Civera, Javier and Milford, Michael},
  journal={arXiv preprint arXiv:2504.04457},
  year={2025}
}

Acknowledgements

Thanks to awesome-slam-datasets.

VSLAM-LAB Supported Baselines and Datasets

We provide a spreadsheet with more detailed information for each baseline and dataset.

| Baseline | System | Sensors | License | Label | Camera Models |
|---|---|---|---|---|---|
| MASt3R-SLAM | VSLAM | mono | CC BY-NC-SA 4.0 | mast3rslam | Pinhole |
| DPVO | VSLAM | mono | License | dpvo | Pinhole |
| DROID-SLAM | VSLAM | mono, rgbd, stereo | BSD-3 | droidslam | Pinhole |
| ORB-SLAM2 | VSLAM | mono, rgbd, stereo | GPLv3 | orbslam2 | Pinhole |
| PyCuVSLAM | VSLAM | rgbd | NVIDIA (NPSL) | pycuvslam | Pinhole |
| MonoGS | VSLAM | | License | monogs | Pinhole |
| AnyFeature-VSLAM | VSLAM | | GPLv3 | anyfeature | Pinhole |
| DSO | VO | | GPLv3 | dso | Pinhole |
| ORB-SLAM3 | VSLAM | mono-vi | GPLv3 | orbslam3 | Pinhole |
| OKVIS2 | VSLAM | mono-vi | BSD-3 | okvis2 | Pinhole |
| GLOMAP | SfM | mono | BSD-3 | glomap | Pinhole |
| COLMAP | SfM | mono | BSD | colmap | Pinhole |
| GenSfM | SfM | | BSD | gensfm | Pinhole |
| Dataset | Data | Mode | Label | Sensors | Camera Models |
|---|---|---|---|---|---|
| ETH3D SLAM Benchmarks | real | handheld | eth | mono, rgbd | Pinhole |
| RGB-D SLAM Dataset and Benchmark | real | handheld | rgbdtum | mono, rgbd | Pinhole |
| The KITTI Vision Benchmark Suite | real | vehicle | kitti | mono | Pinhole |
| The EuRoC MAV Dataset | real | UAV | euroc | mono, stereo, mono-vi | Pinhole |
| ROVER: A Multiseason Dataset for Visual SLAM | real | vehicle | rover | mono, rgbd | Pinhole |
| The UT Campus Object Dataset | real | handheld | ut_coda | mono | Pinhole |
| The Replica Dataset - iMAP | synthetic | handheld | replica | mono, rgbd | Pinhole |
| TartanAir: A Dataset to Push the Limits of Visual SLAM | synthetic | handheld | tartanair | mono | Pinhole |
| ICL-NUIM RGB-D Benchmark Dataset | synthetic | handheld | nuim | mono, rgbd | Pinhole |
| Monocular Visual Odometry Dataset | real | handheld | monotum | | Pinhole |
| RGB-D Dataset 7-Scenes | real | handheld | 7scenes | | Pinhole |
| The Drunkard's Dataset | synthetic | handheld | drunkards | | Pinhole |
| Hamlyn Rectified Dataset | real | handheld | hamlyn | | Pinhole |
| Underwater caves sonar and vision data set | real | underwater | caves | | Pinhole |
| HILTI-OXFORD 2022 | real | handheld | hilti2022 | | Pinhole |

VSLAM-LAB v1.0 Roadmap

Core

  • Build system set up (CMake + options for CUDA/CPU)
  • Docker dev image (CUDA + ROS optional)
  • Pre-commit hooks (clang-format, clang-tidy, black/isort if Python)
  • Licensing & citation (LICENSE + CITATION.cff + BibTeX snippet)
  • Example dataset download script (scripts/get_data.sh)

Datasets

  • KITTI extension to stereo
  • ROVER extension to stereo, mono-vi, stereo-vi
  • TartanAir extension to stereo
  • EuRoC extension to stereo-vi
  • monotum re-implement mono
  • 7scenes re-implement mono, rgbd
  • drunkards re-implement mono, rgbd
  • hamlyn re-implement mono
  • caves re-implement mono
  • hilti2022 re-implement mono
  • scannetplusplus re-implement mono
  • ariel re-implement mono
  • lamar implement mono
  • squidle implement mono
  • openloris re-implement mono
  • madmax implement mono, rgbd, stereo, mono-vi, stereo-vi
  • sweetcorals implement mono
  • reefslam implement mono
  • ...

Baselines

  • AnyFeature VSLAM implement mono, rgbd, stereo
  • DSO VSLAM implement mono
  • MonoGS re-implement mono, rgbd
  • VGGT implement SfM
  • ORBSLAM3 implement mono, rgbd, stereo, stereo-vi, rgbd-vi
  • OKVIS2 implement mono, stereo-vi
  • pyCuVSLAM implement mono, rgbd, stereo, mono-vi, stereo-vi

Metrics

  • Include RPE (see the sketch after this list)
  • Link metrics with modalities
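
For reference, the sketch below computes the standard translational relative pose error (RPE) between two aligned trajectories; it illustrates the metric itself and is not VSLAM-LAB's evaluation code:

import numpy as np

def rpe_translation_rmse(gt_poses, est_poses, delta=1):
    # gt_poses, est_poses: (N, 4, 4) homogeneous camera-to-world matrices.
    # For each pair of poses delta frames apart, compare the relative
    # motion of the estimate against that of the ground truth.
    errors = []
    for i in range(len(gt_poses) - delta):
        gt_rel = np.linalg.inv(gt_poses[i]) @ gt_poses[i + delta]
        est_rel = np.linalg.inv(est_poses[i]) @ est_poses[i + delta]
        err = np.linalg.inv(gt_rel) @ est_rel
        errors.append(np.linalg.norm(err[:3, 3]))  # translational drift
    return float(np.sqrt(np.mean(np.square(errors))))  # RMSE over pairs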

Tooling

  • Ablation tools
  • ROS

Docs

  • README quickstart (build, run, datasets)
  • Config reference (YAML/TOML)
  • Architecture diagram
  • Contributing guide

Demos

  • Example video/gif of live run

Project Management

  • Define statuses: Backlog → In Progress → Review → Done
  • Convert key items above to sub-issues
