This repository holds the code for controlling LeLamp. The runtime provides a comprehensive control system for the robotic lamp, including motor control, recording/replay functionality, voice interaction, and testing capabilities.
LeLamp is an open-source robot lamp based on Apple's ELEGNT, made by [Human Computer Lab].
LeLamp Runtime is a Python-based control system that interfaces with the hardware components of LeLamp including:
- Servo motors for articulated movement
- Audio system (microphone and speaker)
- RGB LED lighting
- Camera system
- Voice interaction capabilities
```
lelamp_runtime/
├── main.py              # Main runtime entry point
├── pyproject.toml       # Project configuration and dependencies
├── lelamp/              # Core package
│   ├── setup_motors.py  # Motor configuration and setup
│   ├── calibrate.py     # Motor calibration utilities
│   ├── record.py        # Movement recording functionality
│   ├── replay.py        # Movement replay functionality
│   ├── app/             # Application modules
│   │   └── voice/       # Voice interaction system
│   ├── follower/        # Follower mode functionality
│   ├── leader/          # Leader mode functionality
│   └── test/            # Hardware testing modules
└── uv.lock              # Dependency lock file
```
- UV package manager
- Hardware components properly assembled (see main LeLamp documentation)
- Clone the runtime repository:

```
git clone https://github.com/humancomputerlab/lelamp_runtime.git
cd lelamp_runtime
```

- Install UV (if not already installed):

```
curl -LsSf https://astral.sh/uv/install.sh | sh
```

- Install dependencies:

```
# For basic functionality
uv sync

# For hardware support (Raspberry Pi)
uv sync --extra hardware
```

Note: For motor setup and control, LeLamp Runtime can run on your computer, and you only need to run `uv sync`. For other functionality that runs on the head Pi (LED control, audio, camera), you need to install LeLamp Runtime on that Pi and run `uv sync --extra hardware`.
If you have Git LFS problems, run the following command:

```
GIT_LFS_SKIP_SMUDGE=1 uv sync
```

If your installation process is slow, limit concurrent downloads with this environment variable:

```
export UV_CONCURRENT_DOWNLOADS=1
```

The runtime includes several key dependencies:
- feetech-servo-sdk: For servo motor control
- lerobot: Robotics framework integration
- livekit-agents: Real-time voice interaction
- numpy: Mathematical operations
- sounddevice: Audio input/output
- adafruit-circuitpython-neopixel: RGB LED control (hardware)
- rpi-ws281x: Raspberry Pi LED control (hardware)
Before using LeLamp, you need to set up and calibrate the servo motors:
- Find the servo driver port:

```
uv run lerobot-find-port
```

- Set up motors with unique IDs:

```
uv run -m lelamp.setup_motors --id your_lamp_name --port the_port_found_in_previous_step
```

The runtime includes comprehensive testing modules to verify all hardware components:
```
# Test RGB LEDs (run with sudo for hardware access)
sudo uv run -m lelamp.test.test_rgb

# Test audio
uv run -m lelamp.test.test_audio

# Test camera functionality
libcamera-hello

# Test motors
uv run -m lelamp.test.test_motors --id your_lamp_name --port the_port_found_in_previous_step
```

The motor test can be run on your computer instead of the Raspberry Pi Zero 2W in the lamp head.
One of LeLamp's key features is the ability to record and replay movement sequences:
To record a movement sequence:
```
uv run -m lelamp.record --id your_lamp_name --port the_port_found_in_previous_step --name movement_sequence_name
```

This will:
- Put the lamp in recording mode
- Allow you to manually manipulate the lamp
- Save the movement data to a CSV file
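The recorder's CSV format isn't specified here, but conceptually, recording samples the servo positions at a fixed rate and appends one row per sample. A minimal sketch of that idea (the column names, sample rate, and `read_positions` callback are assumptions for illustration, not the runtime's actual internals):

```python
import csv
import time

def record_positions(read_positions, path, duration_s=2.0, rate_hz=30):
    """Sample servo positions at a fixed rate and write one CSV row per sample.

    read_positions: callable returning the current servo positions
    (a hypothetical stand-in for the real motor-bus read).
    """
    interval = 1.0 / rate_hz
    start = time.monotonic()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "pos_1", "pos_2", "pos_3"])  # assumed header
        while (elapsed := time.monotonic() - start) < duration_s:
            writer.writerow([round(elapsed, 3), *read_positions()])
            time.sleep(interval)

# Example with a fake position source instead of real servos:
record_positions(lambda: [512, 512, 512], "wave_my_lamp.csv",
                 duration_s=0.1, rate_hz=10)
```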
To replay a recorded movement:
```
uv run -m lelamp.replay --id your_lamp_name --port the_port_found_in_previous_step --name movement_sequence_name
```

The replay system will:
- Load the movement data from the CSV file
- Execute the recorded movements with proper timing
- Reproduce the original motion sequence
Recorded movements are saved as CSV files with the naming convention:
`{sequence_name}_{lamp_id}.csv`
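Combining the naming convention with the replay steps above: a replay loop loads `{sequence_name}_{lamp_id}.csv` and waits until each row's recorded timestamp before sending the positions. A rough sketch under the same assumed CSV layout (the `send` callback is a hypothetical stand-in for the real motor-bus write):

```python
import csv
import time

def replay(name, lamp_id, send):
    """Replay a recorded sequence, honoring the original timing.

    send: callable taking a list of target positions (hypothetical
    stand-in for the runtime's actual servo write).
    """
    path = f"{name}_{lamp_id}.csv"  # naming convention from the README
    start = time.monotonic()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = float(row.pop("timestamp"))
            # Wait until this row's offset from the start of playback
            delay = t - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            send([int(v) for v in row.values()])

# Example: replay("wave", "my_lamp", send=print)
```

Waiting on the absolute offset from the start of playback, rather than sleeping a fixed interval, keeps timing from drifting when a write takes longer than expected.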
To start LeLamp's voice app on boot, create a systemd service file:

```
sudo nano /etc/systemd/system/lelamp.service
```

Add this content:
```ini
[Unit]
Description=Lelamp Runtime Service
After=network.target

[Service]
Type=simple
User=pi
WorkingDirectory=/home/pi/lelamp_runtime
ExecStart=/usr/bin/sudo uv run main.py console
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Then enable and start the service:
```
sudo systemctl daemon-reload
sudo systemctl enable lelamp.service
sudo systemctl start lelamp.service
```

For other service controls:
```
# Disable from starting on boot
sudo systemctl disable lelamp.service

# Stop the currently running service
sudo systemctl stop lelamp.service

# Check status (should show "disabled" and "inactive" once stopped)
sudo systemctl status lelamp.service
```

The following sample apps test LeLamp's capabilities.
To run a conversational agent on LeLamp, create a `.env` file with the following content in the root of this directory on your Raspberry Pi:

```
OPENAI_API_KEY=
LIVEKIT_URL=
LIVEKIT_API_KEY=
LIVEKIT_API_SECRET=
```

For how to obtain these values, please refer to LiveKit's guide.
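A missing credential usually surfaces as a confusing failure deep inside the agent, so it can help to verify the four variables before launching. A small sketch (the variable names come from the `.env` template above; the check itself is my addition, not part of the runtime, and it assumes the variables have already been loaded into the environment, e.g. via python-dotenv):

```python
import os

REQUIRED = ["OPENAI_API_KEY", "LIVEKIT_URL", "LIVEKIT_API_KEY", "LIVEKIT_API_SECRET"]

def missing_env(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

# Example: in an empty environment, all four names are reported missing
print(missing_env({}))
```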
Then you can run the agent app by:
```
# Only need to run this once
uv run -m lelamp.app.voice.agent download-files

# For conversational AI
uv run -m lelamp.app.voice.agent console
```
- Servo Connection Issues:
  - Verify the USB connection to the servo driver
  - Check port permissions
  - Ensure a proper power supply
  - Ensure your servos have the right IDs
- Audio Problems:
  - Verify ReSpeaker Hat installation
  - Check ALSA configuration
  - Test with `aplay -l` and `arecord -l`
  - Ensure you have unmuted your speaker by hitting M on the corresponding slider in alsamixer
- Permission Errors:
  - Run RGB tests with `sudo`
  - Check user permissions for hardware access
- Sudo Sound Error: put the following into `/etc/asound.conf`:

```
# Default capture & playback device = Seeed 2-Mic
pcm.!default {
    type plug
    slave {
        pcm "hw:3,0"  # input/output device (Seeed 2-Mic)
    }
}

ctl.!default {
    type hw
    card 3
}
```

This is an open-source project by Human Computer Lab. Contributions are welcome through the GitHub repository.
Check the main LeLamp repository for licensing information.