End-to-end autonomous RC car system using vision-based imitation learning on Raspberry Pi 5.
Main project: src/robots/rover/
See src/robots/rover/README.md for complete documentation.
- Project Structure: PROJECT_STRUCTURE.md - Overview of the codebase
- Getting Started: src/robots/rover/README.md - Complete rover documentation
- Training Guide: TRAINING_GUIDE.md - How to train ACT policy
- Deployment Guide: DEPLOYMENT.md - Deploy to Raspberry Pi 5
- Training Output: TRAINING_OUTPUT_GUIDE.md - Understanding training results
- Quick Reference: QUICK_REFERENCE.md - Common commands
- Hardware Options: docs/EDGE_DEPLOYMENT_PLAN.md - Raspberry Pi 5 + Hailo / Jetson Orin
- Quantization Deep Dive: src/robots/rover/docs/quantization/ - Complete quantization workflow
```
RC Receiver → Arduino UNO → Raspberry Pi 5 → Autonomous Control
   (PWM)        (30Hz)        (Camera + ML)
```
Hardware:
- RC car with 2-channel receiver
- Arduino UNO R3 (PWM reader)
- Raspberry Pi 5 (data collection + inference)
- Camera (30 fps)
Wiring:
- Brown wire → GND
- Purple wire → Arduino Pin 2 (Steering)
- Black wire → Arduino Pin 3 (Throttle)
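To sanity-check the wiring, the Pi can echo whatever the Arduino streams over USB serial. Below is a minimal reader sketch, not the project's calibration tool: it assumes (purely for illustration) that the firmware prints one comma-separated `steering_us,throttle_us` line per 30Hz cycle, and that pyserial is installed.

```python
# Hypothetical PWM echo tool -- the real calibration tool ships with the
# rover. This sketch only assumes the Arduino prints "steering_us,throttle_us"
# lines (one per 30 Hz cycle) over USB serial.
import serial  # pyserial

def echo_pwm(port: str = "/dev/ttyUSB0", baud: int = 115200) -> None:
    with serial.Serial(port, baud, timeout=1.0) as ser:
        while True:
            raw = ser.readline().decode("ascii", errors="ignore").strip()
            try:
                steering_us, throttle_us = (int(v) for v in raw.split(","))
            except ValueError:
                continue  # skip empty or partial lines
            # RC PWM pulses typically span ~1000-2000 us, centered near 1500 us.
            print(f"steering={steering_us} us  throttle={throttle_us} us")

if __name__ == "__main__":
    echo_pwm()
```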
- Validate - Run calibration tool to verify signals
- Collect - Record driving episodes (camera + PWM)
- Train - Learn from demonstrations (see below ⭐)
- Deploy - Autonomous control on-device
```bash
cd src/robots/rover
python3 src/record/episode_recorder.py --episode-duration 15 --output-dir ./episodes
```
```bash
# See TRAINING_GUIDE.md for full details
./start_training.sh  # Quick start
# or use tmux for persistent training (see TRAINING_GUIDE.md)
```
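start_training.sh drives the actual ACT training. Conceptually, the imitation objective is plain supervised regression from camera frames to the recorded PWM commands; here is a generic PyTorch sketch of that objective, with placeholder model and dataset (the real pipeline and hyperparameters live in TRAINING_GUIDE.md):

```python
# Conceptual behavior-cloning loop -- not the project's trainer. `model`
# and `dataset` are placeholders; the real pipeline trains a LeRobot ACT
# policy via start_training.sh (see TRAINING_GUIDE.md).
import torch
from torch.utils.data import DataLoader

def behavior_clone(model, dataset, epochs: int = 10, lr: float = 1e-4) -> None:
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for frames, pwm_targets in loader:  # images -> [steering, throttle]
            loss = torch.nn.functional.l1_loss(model(frames), pwm_targets)
            opt.zero_grad()
            loss.backward()
            opt.step()
```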
```bash
# Quantize and deploy
./deploy_to_pi.sh outputs/lerobot_act/best_model.pth mboels@raspberrypi
```
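deploy_to_pi.sh quantizes the checkpoint before copying it over; the models/best_model_static_quantized.pth used below is the result. For orientation, a minimal sketch of post-training static quantization in eager-mode PyTorch, assuming a model that already wraps its forward pass in quant/dequant stubs (the project's actual workflow is documented in src/robots/rover/docs/quantization/):

```python
# Illustrative static quantization -- the real workflow is documented in
# src/robots/rover/docs/quantization/. Assumes `model` already contains
# QuantStub/DeQuantStub, as eager-mode quantization requires.
import torch
import torch.ao.quantization as tq

def static_quantize(model: torch.nn.Module, calibration_batches) -> torch.nn.Module:
    model.eval()
    model.qconfig = tq.get_default_qconfig("qnnpack")  # ARM backend (Raspberry Pi)
    prepared = tq.prepare(model)               # insert observers
    with torch.no_grad():
        for batch in calibration_batches:      # calibrate activation ranges
            prepared(batch)
    return tq.convert(prepared)                # swap float modules for int8
```

Static int8 quantization of weights and activations cuts the checkpoint to roughly a quarter of its fp32 size and speeds up CPU inference, which is why the Pi runs the quantized model.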
```bash
# On Raspberry Pi
cd ~/EDTH2025/Erewhon/src/robots/rover
python3 src/inference/act_inference_quantized.py \
    --checkpoint models/best_model_static_quantized.pth \
    --camera_id 0 \
    --arduino_port /dev/ttyUSB0 \
    --control_freq 30
```

📁 Navigate to src/robots/rover/ for full documentation
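For orientation, here is a stripped-down sketch of the perceive-act loop an on-device inference script like the one above implements. `policy` and the serial command format are placeholders; the real logic lives in src/inference/act_inference_quantized.py.

```python
# Skeleton 30 Hz perceive-act loop -- illustrative only; the real entry
# point is src/inference/act_inference_quantized.py. `policy` and the
# serial command format are placeholders.
import time
import cv2
import serial
import torch

CONTROL_PERIOD = 1.0 / 30  # matches --control_freq 30

def control_loop(policy, camera_id: int = 0, port: str = "/dev/ttyUSB0") -> None:
    cap = cv2.VideoCapture(camera_id)
    with serial.Serial(port, 115200, timeout=0.1) as ser:
        while True:
            start = time.monotonic()
            ok, frame = cap.read()  # BGR frame, HxWx3
            if not ok:
                continue
            obs = torch.from_numpy(frame).permute(2, 0, 1).float().unsqueeze(0) / 255.0
            with torch.no_grad():
                steering_us, throttle_us = policy(obs)  # placeholder output convention
            ser.write(f"{int(steering_us)},{int(throttle_us)}\n".encode("ascii"))
            # Sleep off the rest of the ~33 ms budget to hold 30 Hz.
            time.sleep(max(0.0, CONTROL_PERIOD - (time.monotonic() - start)))
```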