mjlab combines Isaac Lab's proven API with best-in-class MuJoCo physics to provide lightweight, modular abstractions for RL robotics research and sim-to-real deployment.
⚠️ BETA PREVIEW: mjlab is in active development. Expect breaking changes and missing features during the beta phase. There is no stable release yet. The PyPI package is only a snapshot — for the latest fixes and improvements, install from source or Git.
mjlab requires an NVIDIA GPU for training (via MuJoCo Warp). macOS is supported only for evaluation, which is significantly slower.
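Before training, it's worth confirming that the NVIDIA driver and GPU are visible to the system (a generic sanity check, not an mjlab command):

```bash
# Confirm an NVIDIA GPU and driver are available (required for MuJoCo Warp training)
nvidia-smi
```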
```bash
# Install uv if you haven't already
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Run the demo (no installation needed):
```bash
uvx --from mjlab --with "mujoco-warp @ git+https://github.com/google-deepmind/mujoco_warp@486642c3fa262a989b482e0e506716d5793d61a9" demo
```

This launches an interactive viewer with a pre-trained Unitree G1 agent tracking a reference dance motion in MuJoCo Warp.
❓ Having issues? See the FAQ.
From source (recommended during beta):
```bash
git clone https://github.com/mujocolab/mjlab.git
cd mjlab
uv run demo
```

From PyPI (beta snapshot):
```bash
uv add mjlab "mujoco-warp @ git+https://github.com/google-deepmind/mujoco_warp@486642c3fa262a989b482e0e506716d5793d61a9"
```

For full setup instructions, see the Installation Guide.
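After installing, a quick import check confirms the package resolves. This is a generic sanity check: whether mjlab exposes a `__version__` attribute is an assumption, so the snippet guards for it.

```bash
# Sanity-check the install; falls back gracefully if no __version__ is exposed
uv run python -c "import mjlab; print(getattr(mjlab, '__version__', 'import OK'))"
```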
Train a Unitree G1 humanoid to follow velocity commands on flat terrain:
```bash
MUJOCO_GL=egl uv run train Mjlab-Velocity-Flat-Unitree-G1 --env.scene.num-envs 4096
```
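`MUJOCO_GL=egl` selects MuJoCo's headless EGL rendering backend, which is what you want on a display-less training machine. Rather than prefixing every command, you can export it once per shell:

```bash
# Use headless EGL rendering for the whole session
export MUJOCO_GL=egl
```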
Evaluate a policy while training (fetches the latest checkpoint from Weights & Biases):

```bash
uv run play --task Mjlab-Velocity-Flat-Unitree-G1-Play --wandb-run-path your-org/mjlab/run-id
```

Train a Unitree G1 to mimic reference motions. mjlab uses WandB to manage reference motion datasets:
1. Create a registry collection in your WandB workspace named `Motions`.

2. Set your WandB entity:

   ```bash
   export WANDB_ENTITY=your-organization-name
   ```

3. Process and upload motion files (a sketch for verifying the upload follows the note below):

   ```bash
   MUJOCO_GL=egl uv run scripts/tracking/csv_to_npz.py \
     --input-file /path/to/motion.csv \
     --output-name motion_name \
     --input-fps 30 \
     --output-fps 50 \
     --render  # Optional: generates preview video
   ```
Note: For detailed motion preprocessing instructions, see the BeyondMimic documentation.
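To confirm the upload landed in your registry, you can fetch the artifact back with the standard WandB CLI. This is a minimal sketch: the `<entity>/<project>/<name>:<alias>` path and the `:latest` alias are assumptions based on the entity and output name used above; adjust them to your registry's layout.

```bash
# Assumed path layout: <entity>/<project>/<output-name>:<alias>.
# Downloads the processed motion file(s) into ./downloaded_motion.
uv run wandb artifact get your-org/motions/motion_name:latest --root ./downloaded_motion
```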
Then train on the uploaded motion:

```bash
MUJOCO_GL=egl uv run train Mjlab-Tracking-Flat-Unitree-G1 --registry-name your-org/motions/motion-name --env.scene.num-envs 4096
```
Evaluate the tracking policy:

```bash
uv run play --task Mjlab-Tracking-Flat-Unitree-G1-Play --wandb-run-path your-org/mjlab/run-id
```

Run tests:
```bash
make test
```

Format code:
```bash
uvx pre-commit install
make format
```

mjlab is licensed under the Apache License, Version 2.0.
The `third_party/` directory contains files from external projects, each with its own license:

- `isaaclab/` — NVIDIA Isaac Lab (BSD-3-Clause)
When distributing or modifying mjlab, comply with:
- The Apache-2.0 license for mjlab’s original code
- The respective licenses in `third_party/`
mjlab wouldn't exist without the excellent work of the Isaac Lab team, whose API design and abstractions mjlab builds upon.
Thanks to the MuJoCo Warp team — especially Erik Frey and Taylor Howell — for answering our questions, giving helpful feedback, and repeatedly implementing features based on our requests.