VR-Robo: A Real-to-Sim-to-Real Framework for Visual Robot Navigation and Locomotion
Shaoting Zhu*, Linzhan Mou*, Derun Li, Baijun Ye, Runhan Huang, Hang Zhao†
RA-L 2025
- [2025-06-14] We release the real2sim and RL policy training code.
- [2025-05-11] Our paper is accepted by RA-L 2025.
- [2025-02-01] Paper released on arXiv.
We use two separate conda environments, one for Isaac Lab and one for the renderer. Please follow the instructions in each directory to complete the installation.
- For the Isaac Lab environment, refer to vrrobo_isaaclab/README.md for installation instructions.
- For the renderer environment, refer to vrrobo_renderer/README.md to complete the setup.
To run the demo, first start the render server in the vrrobo_renderer directory:
conda activate vr-robo-renderer
python render_server.py
Then open a new terminal in the vrrobo_isaaclab directory and run the following command to play the demo:
conda activate vr-robo-isaaclab
python scripts/rsl_rl/play_gs.py --task go2_gs_play
To train the model, run:
conda activate vr-robo-isaaclab
python scripts/rsl_rl/train_gs.py --task go2_gs --headless
If you have any questions, please feel free to open an issue or e-mail us at [email protected] or [email protected].
Our scene reconstruction code builds on 3DGS, 2DGS, and PGSR. Many thanks to these excellent projects.
You can find our paper on arXiv.
If you find this code or our paper useful for your research, please consider citing:
@article{zhu2025vr,
title={VR-Robo: A Real-to-Sim-to-Real Framework for Visual Robot Navigation and Locomotion},
author={Zhu, Shaoting and Mou, Linzhan and Li, Derun and Ye, Baijun and Huang, Runhan and Zhao, Hang},
journal={arXiv preprint arXiv:2502.01536},
year={2025}
}