ZED Camera Integration with LeRobot: The camera stack provides 6 modalities for surgical robotics (see the capture sketch below):
- Wrist camera view (U20CAM-1080p)
- ZED left eye RGB
- ZED right eye RGB
- ZED depth map (20-45cm optimized)
- ZED confidence map
- ZED point cloud (optional)
| ZED Left Eye RGB | ZED Right Eye RGB |
| --- | --- |
| Depth Map (color-coded distances) | Confidence Map (depth accuracy) |
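A minimal capture sketch using the ZED Python SDK (`pyzed`); the init parameters here are assumptions, not the project's tuned settings (the tested scripts live in `src/zed_tests/`), and the U20CAM wrist view comes from a regular UVC webcam, so it is not shown:

```python
# Sketch: grab the five ZED modalities with the ZED Python SDK.
# Parameter choices are illustrative; see src/zed_tests/ for tested scripts.
import pyzed.sl as sl

zed = sl.Camera()
init = sl.InitParameters()
init.depth_mode = sl.DEPTH_MODE.ULTRA       # assumption: high-quality depth mode
init.coordinate_units = sl.UNIT.MILLIMETER
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open ZED camera")

runtime = sl.RuntimeParameters()
left, right = sl.Mat(), sl.Mat()
depth, confidence, cloud = sl.Mat(), sl.Mat(), sl.Mat()
if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
    zed.retrieve_image(left, sl.VIEW.LEFT)                    # left eye RGB
    zed.retrieve_image(right, sl.VIEW.RIGHT)                  # right eye RGB
    zed.retrieve_measure(depth, sl.MEASURE.DEPTH)             # depth map
    zed.retrieve_measure(confidence, sl.MEASURE.CONFIDENCE)   # confidence map
    zed.retrieve_measure(cloud, sl.MEASURE.XYZRGBA)           # optional point cloud
zed.close()
```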
This project includes a complete LeRobot development environment with Python 3.10 and all required dependencies.
For active development (recommended):

```bash
source setup/activate_lerobot.sh
```

This activates the environment in your current shell - you'll see `(.lerobot)` in your prompt.

For information display:

```bash
./lesurgeon.sh activate
```

This shows environment information but returns you to your original shell when finished.
Environment Activation:

```bash
# Method 1: Direct activation (recommended for development)
source setup/activate_lerobot.sh    # Keeps you in activated environment with (.lerobot) prompt

# Method 2: Information display only
./lesurgeon.sh activate             # Shows environment info but returns to original shell
```

Robot Operations:

```bash
./lesurgeon.sh identify # Identify which arm is leader/follower (setup)
./lesurgeon.sh status # Check robot status
./lesurgeon.sh calibrate # Calibrate robots
./lesurgeon.sh teleoperate # Standard teleoperation
./lesurgeon.sh teleop-cam    # Camera-enabled teleoperation (U20CAM-1080p)
```

Data & Machine Learning:

```bash
./lesurgeon.sh hf-setup # Setup Hugging Face authentication
./lesurgeon.sh record # Record teleoperation data for ML
./lesurgeon.sh train # Train ML policy on data
./lesurgeon.sh inference # Run trained policy
./lesurgeon.sh help          # Show all commands
```

Manual commands:
1. Activate the environment (for development work):

   ```bash
   source setup/activate_lerobot.sh   # You'll see (.lerobot) in your prompt indicating the environment is active
   ```

2. Test the installation:

   ```bash
   python -c "import lerobot; print('LeRobot works!')"
   ```

3. Setup Weights & Biases (if needed):

   ```bash
   python setup/setup_wandb.py
   ```

4. Start teleoperation:

   ```bash
   bash run/teleoperate.sh
   ```
Note: The difference between `./lesurgeon.sh activate` and `source setup/activate_lerobot.sh`:

- `./lesurgeon.sh activate` displays environment information but doesn't keep you in the activated environment
- `source setup/activate_lerobot.sh` actually activates the environment in your current shell (recommended for development)
The environment includes:

- Python 3.10 virtual environment in `.lerobot/`
- LeRobot with all optional dependencies (`lerobot[all]`)
- System dependencies: ffmpeg, cmake, build tools, robotics libraries
- Weights & Biases integration for experiment tracking
- Development tools: pre-commit, pytest, debugging tools
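A quick way to confirm the key packages resolve inside the activated environment; module names are taken from the list above, and `torch` is assumed here only because LeRobot is PyTorch-based:

```python
# Sanity check: confirm key packages resolve inside the .lerobot environment.
import importlib.util

for mod in ("lerobot", "torch", "wandb", "pytest"):
    found = importlib.util.find_spec(mod) is not None
    print(f"{mod}: {'ok' if found else 'MISSING'}")
```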
- setup/ - Environment setup and configuration scripts (includes the ZED SDK installer)
  - `activate_lerobot.sh` - Environment activation script
  - `setup_huggingface.sh` - Hugging Face authentication and setup
  - `identify_arms_interactive.sh` - Interactive arm identification wizard
  - `verify_arm_identification.sh` - Verify current arm mappings
  - `setup_wandb.py` - Weights & Biases configuration
  - `setup_summary.sh` - Environment setup documentation
- run/ - Operational scripts for robot tasks
  - `teleoperate.sh` - Standard teleoperation session
  - `teleoperate_with_camera.sh` - Camera-enabled teleoperation (U20CAM-1080p @ 720p)
  - `robot_status.sh` - Check robot calibration status
  - `record_data.sh` - Record teleoperation data for ML training
  - `upload_dataset.sh` - Upload datasets to Hugging Face Hub
  - `train_policy.sh` - Train ML policies on recorded data
  - `run_inference.sh` - Run trained policy inference
  - `replay_episodes.sh` - Replay recorded episodes
  - `visualize_dataset.sh` - Visualize datasets
- config/ - Robot configuration and calibration data
  - `calibration.sh` - Robot calibration commands
  - `calibration_backups/` - Backup copies of calibration files
- debug/ - Diagnostic and troubleshooting tools
  - `diagnose_motors.py` - Motor diagnostic script
  - `simple_motor_check.py` - Simple motor position checker
  - `zed_experiments/` - ZED camera development and experimental scripts
- src/ - Source code and production modules
  - `cameras/` - Camera integration modules (ZED SDK, multimodal capture)
  - `zed_tests/` - ZED camera testing scripts
- docs/ - Documentation and guides
- .lerobot/ - Python virtual environment (ignored by git)
First-time setup - Identify your arms:

```bash
./lesurgeon.sh identify    # Interactive wizard to identify leader/follower arms
```

Once your robots are calibrated, you can:
Teleoperation (Control follower with leader arm):

```bash
./lesurgeon.sh teleoperate # Standard teleoperation (no camera)
./lesurgeon.sh teleop-cam # Camera-enabled teleoperation with U20CAM-1080p
# OR manually:
bash run/teleoperate.sh # Direct standard command
bash run/teleoperate_with_camera.sh   # Direct camera command
```
*Dual robotic arms with ZED stereo depth sensing for surgical training*
Data Recording:

```bash
./lesurgeon.sh record # Interactive data recording with camera
./lesurgeon.sh record -n 5 -t "Needle picking and passing" # Custom episodes and task
./lesurgeon.sh record -r    # Resume previous recording session
```

Check Robot Status:

```bash
./lesurgeon.sh status    # Check calibration and connection status
```

Ultra-Short Range Surgical Configuration: The ZED 2 camera has been optimized for surgical robotics with ultra-short range depth perception:
- Range: 20-45cm (optimized for surgical workspace)
- Precision: ±44mm surgical-grade accuracy
- Frame Rate: 10.3 FPS real-time processing
- Modalities: RGB stereo (left/right), depth map, confidence map
*Surgical workspace depth sensing optimized for the 20-45cm range*
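A configuration sketch for this range using the ZED Python SDK; the values mirror the specs above, but treat the parameter choices as assumptions rather than the project's tuned settings (those live in `debug/zed_experiments/zed_ultra_short_range.py`):

```python
# Sketch: constrain ZED depth sensing to the ~20-45cm surgical workspace.
# Tuned production settings: debug/zed_experiments/zed_ultra_short_range.py
import pyzed.sl as sl

init = sl.InitParameters()
init.coordinate_units = sl.UNIT.MILLIMETER
init.depth_mode = sl.DEPTH_MODE.ULTRA    # assumption: highest-quality mode
init.depth_minimum_distance = 200        # 20cm near clip (in coordinate_units)
init.depth_maximum_distance = 450        # 45cm far clip (in coordinate_units)

zed = sl.Camera()
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open ZED camera")
```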
Live Multi-Modal Display:

```bash
# Run real-time 4-view surgical display (RGB left/right, depth, confidence)
python debug/zed_experiments/live_surgical_multimodal.py
```

ZED Teleoperation:

```bash
./lesurgeon.sh teleop-zed # ZED multi-modal teleoperation (coming soon)
# OR manually:
bash run/teleoperate_zed_multimodal.sh   # ZED-enabled teleoperation
```

ZED Testing and Development:

```bash
# Test ZED SDK installation
python src/zed_tests/test_zed_sdk_installation.py
# Test multi-modal capture
python src/zed_tests/test_zed_multimodal_display.py
# Ultra-short range configuration
python debug/zed_experiments/zed_ultra_short_range.py
```

Setup (first time only):

```bash
./lesurgeon.sh hf-setup    # Authenticate with Hugging Face
```

Complete ML Pipeline:

```bash
# 1. Record training data
./lesurgeon.sh record -n 50 -d "lesurgeon-s01" -t "Needle picking and passing"
# 2. Upload dataset to Hugging Face
./lesurgeon.sh upload
# 3. Train a policy
./lesurgeon.sh train -p act
# 4. Run inference with trained policy
./lesurgeon.sh inference
# 5. Replay episodes for verification
./lesurgeon.sh replay -e 0
# 6. Visualize your data
./lesurgeon.sh visualize
```
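After recording, you can also inspect a dataset in Python before training. A minimal sketch, assuming LeRobot's `LeRobotDataset` API (the import path varies across lerobot versions) and a hypothetical repo id:

```python
# Sketch: inspect a recorded dataset before training.
# Import path is from recent lerobot releases and may differ in your version;
# the repo id below is a hypothetical placeholder.
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

ds = LeRobotDataset("your-hf-username/lesurgeon-s01")
print(f"{ds.num_episodes} episodes, {len(ds)} frames")
frame = ds[0]    # one timestep: camera images, robot state, action
print(sorted(frame.keys()))
```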
Dataset Series Recording:

For comprehensive training data, record multiple series of episodes using consistent naming:

```bash
./lesurgeon.sh record -d "lesurgeon-s01" -n 50    # Series 1: Episodes 1-50
```

Dataset Naming Convention:
- Format: `lesurgeon-s##` where `##` is the zero-padded series number
- Task: Uses the default task name "Needle grasping and passing" for consistency
- Episodes: 50 episodes per series (standard for surgical training data)
- Upload: Each dataset is automatically uploaded to Hugging Face after recording
- Benefits (see the naming helper sketched below):
  - Separate datasets are safer (one failure doesn't affect others)
  - Easier to manage and upload individually
  - Can train on individual series or combine multiple series
  - Clear organization for large-scale data collection
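A purely illustrative helper showing the zero-padded naming scheme:

```python
# Illustrative helper for the lesurgeon-s## naming convention.
def series_name(series: int) -> str:
    return f"lesurgeon-s{series:02d}"

print([series_name(i) for i in (1, 2, 10)])
# ['lesurgeon-s01', 'lesurgeon-s02', 'lesurgeon-s10']
```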
Advanced ML Commands:

```bash
# Resume training from checkpoint
./lesurgeon.sh train -r
# Train with custom dataset and policy type
./lesurgeon.sh train -d my-dataset -p act -v cuda
# Run inference with teleop fallback
./lesurgeon.sh inference --teleop
# Replay specific episodes
./lesurgeon.sh replay -e 3 -d my-dataset
# Visualize specific episodes
./lesurgeon.sh visualize -e 5 -o my_viz_folder
```

The stl_files/ directory contains 3D models and G-code files for the robotics hardware.
Why identify arms? The system needs to distinguish between the leader arm (controller) and follower arm (mimic). The identification wizard prevents confusion by letting you physically connect each arm when prompted.
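If you want to see what the wizard is working with, you can list the attached serial devices yourself; a minimal sketch using pyserial (an illustrative check, not the wizard's actual mechanism):

```python
# Illustrative check (not the wizard itself): list attached USB serial
# devices so you can see which port each arm enumerates on.
from serial.tools import list_ports  # pip install pyserial

for port in list_ports.comports():
    print(port.device, "-", port.description)
```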
First-time setup:

```bash
./lesurgeon.sh identify    # Run the interactive identification wizard
```

Verify current setup:

```bash
./setup/verify_arm_identification.sh    # Check current arm mappings
```

The identification process:
1. Disconnect both arms
2. Connect only the LEADER arm (the one you control) when prompted
3. Connect the FOLLOWER arm (the one that mimics) when prompted
4. Configuration is automatically saved and tested
Manual identification scripts:

```bash
./setup/identify_arms_interactive.sh    # Direct script access
./setup/verify_arm_identification.sh    # Direct verification
```

Happy robot learning! 🚀