This repository provides teleoperation functionality in IsaacLab using the SO101Leader (LeRobot), including data collection, data conversion, and subsequent policy training.
- 🤖 We use SO101Follower as the robot in IsaacLab and provide the corresponding teleoperation methods.
- 🔄 We offer scripts to convert data from HDF5 format to the LeRobot Dataset.
- 🧠 We utilize simulation-collected data to fine-tune GR00T N1.5 and deploy it on real hardware.
Tip
Welcome to the Lightwheel open-source community!
Join us, contribute, and help shape the future of AI and robotics. For questions or collaboration, contact Zeyu or Yinghao.
First, follow the IsaacLab official installation guide to install IsaacLab. We recommend using Conda for easier environment management. In summary, you only need to run the following commands.
# Create and activate environment
conda create -n leisaac python=3.10
conda activate leisaac
# Install cuda-toolkit
conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit
# Install PyTorch
pip install torch==2.5.1 torchvision==0.20.1 --index-url https://download.pytorch.org/whl/cu118
# Install IsaacSim
pip install --upgrade pip
pip install 'isaacsim[all,extscache]==4.5.0' --extra-index-url https://pypi.nvidia.com
# Install IsaacLab
git clone [email protected]:isaac-sim/IsaacLab.git
sudo apt install cmake build-essential
cd IsaacLab
./isaaclab.sh --install
Clone this repository and install it as a dependency.
git clone https://github.com/LightwheelAI/leisaac.git
cd leisaac
pip install -e source/leisaac
pip install pynput pyserial deepdiff feetech-servo-sdk
We provide an example USD asset: a kitchen scene. Please download the related scene here and extract it into the `assets` directory. The directory structure should look like this:
<assets>
├── robots/
│   └── so101_follower.usd
└── scenes/
    └── kitchen_with_orange/
        ├── scene.usd
        ├── assets
        └── objects/
            ├── Orange001
            ├── Orange002
            ├── Orange003
            └── Plate
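Before launching a task, it can be handy to confirm the assets were extracted to the right place. The snippet below is a minimal sanity-check sketch; the paths simply mirror the directory tree above.

```python
from pathlib import Path

# Expected asset paths, mirroring the directory tree above.
EXPECTED = [
    "robots/so101_follower.usd",
    "scenes/kitchen_with_orange/scene.usd",
]

def check_assets(assets_dir: str) -> list[str]:
    """Return the expected asset paths that are missing under assets_dir."""
    root = Path(assets_dir)
    return [rel for rel in EXPECTED if not (root / rel).exists()]

if __name__ == "__main__":
    missing = check_assets("assets")
    if missing:
        print("Missing assets:", ", ".join(missing))
    else:
        print("All example assets found.")
```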
| Scene Name | Description | Download Link | 
|---|---|---|
| Kitchen with Orange | Example kitchen scene with oranges | Download | 
| Lightwheel Toyroom | Modern room with many toys | Download | 
Tip
For more high-quality scene assets, please visit our official website or the Releases page.
We use the SO101Leader as the teleoperation device. Please follow the official documentation for connection and configuration.
You can run teleoperation tasks with the following script:
python scripts/environments/teleoperation/teleop_se3_agent.py \
    --task=LeIsaac-SO101-PickOrange-v0 \
    --teleop_device=so101leader \
    --port=/dev/ttyACM0 \
    --num_envs=1 \
    --device=cpu \
    --enable_cameras \
    --record \
    --dataset_file=./datasets/dataset.hdf5
Parameter Descriptions:
- `--task`: Specify the task environment name to run, e.g., `LeIsaac-SO101-PickOrange-v0`.
- `--teleop_device`: Specify the teleoperation device type, e.g., `so101leader` or `keyboard`.
- `--port`: Specify the port of the teleoperation device, e.g., `/dev/ttyACM0`; only used when `teleop_device` is `so101leader`.
- `--num_envs`: Set the number of parallel simulation environments, usually `1` for teleoperation.
- `--device`: Specify the computation device, such as `cpu` or `cuda` (GPU).
- `--enable_cameras`: Enable camera sensors to collect visual data during teleoperation.
- `--record`: Enable data recording; saves teleoperation data to an HDF5 file.
- `--dataset_file`: Path to save the recorded dataset, e.g., `./datasets/record_data.hdf5`.
If the calibration file does not exist at the specified cache path, or if you launch with --recalibrate, you will be prompted to calibrate the SO101Leader.  Please refer to the documentation for calibration steps.
After entering the IsaacLab window, press the b key on your keyboard to start teleoperation. You can then use the specified teleop_device to control the robot in the simulation. If you need to reset the environment after completing your operation, simply press the r or n key: r resets the environment and marks the task as failed, while n resets the environment and marks the task as successful.
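The key bindings above can be summarized as a small dispatch table. This is a hypothetical sketch for illustration only; the actual key handling lives inside the teleoperation script.

```python
# Hypothetical sketch of the keyboard controls described above;
# the real handling is implemented in teleop_se3_agent.py.
KEY_ACTIONS = {
    "b": ("start_teleoperation", None),
    "r": ("reset_environment", "failed"),      # reset and mark episode as failed
    "n": ("reset_environment", "successful"),  # reset and mark episode as successful
}

def handle_key(key: str):
    """Map a pressed key to a (command, episode_outcome) pair, or None if unbound."""
    return KEY_ACTIONS.get(key.lower())
```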
Troubleshooting:
If you encounter permission errors like ConnectionError, you may need to run:
sudo chmod 666 /dev/ttyACM0
# or just add your user in related groups
sudo usermod -aG dialout $USER
Collected teleoperation data is stored in HDF5 format in the specified directory. We provide a script to convert HDF5 data to the LeRobot Dataset format. Only successful episodes will be converted.
Note
This script depends on the LeRobot runtime environment. We recommend using a separate Conda environment for LeRobot—see the official LeRobot repo for installation instructions.
You can modify the parameters in the script and run the following command:
python scripts/convert/isaaclab2lerobot.py
GR00T N1.5 provides a fine-tuning workflow based on the LeRobot Dataset. You can refer to nvidia/gr00t-n1.5-so101-tuning to fine-tune it with your collected LeRobot data. We take the pick-orange task as an example.
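As a simplified illustration of the "only successful episodes" rule, the filtering step might look like the sketch below. It operates over plain dictionaries for clarity; the real script reads the recorded HDF5 file and writes a LeRobot Dataset, and the `success` field name here is an assumption.

```python
def successful_episodes(episodes):
    """Keep only episodes marked successful (ended with the 'n' key).

    `episodes` is a list of dicts with a boolean 'success' flag -- an assumed
    stand-in for the per-episode metadata stored in the recorded HDF5 file.
    """
    return [ep for ep in episodes if ep.get("success", False)]

recorded = [
    {"id": "demo_0", "success": True},   # ended with the 'n' key
    {"id": "demo_1", "success": False},  # ended with the 'r' key
    {"id": "demo_2", "success": True},
]
kept = successful_episodes(recorded)
print([ep["id"] for ep in kept])  # -> ['demo_0', 'demo_2']
```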
- First, collect a pick-orange dataset in simulation.
- Then, fine-tune GR00T N1.5 using this data.
- Finally, deploy the trained policy on real hardware.
We gratefully acknowledge IsaacLab and LeRobot for their excellent work, from which we have borrowed some code.
We're always looking for talented individuals passionate about AI and robotics! If you're interested in:
- 🤖 Robotics Engineering: Working with cutting-edge robotic systems and teleoperation
- 🧠 AI/ML Research: Developing next-generation AI models for robotics
- 💻 Software Engineering: Building robust, scalable robotics software
- 🔬 Research & Development: Pushing the boundaries of what's possible in robotics
Join us at Lightwheel AI! We offer:
- Competitive compensation and benefits
- Work with state-of-the-art robotics technology
- Collaborative, innovative environment
- Opportunity to shape the future of AI-powered robotics
Apply Now → | Contact Now → | Learn More About Us →
Let's build the future of robotics together! 🤝