[arXiv 2025] GMR: General Motion Retargeting. Retarget human motions into diverse humanoid robots in real time on CPU. Retargeter for TWIST.

GMR: General Motion Retargeting

Key features of GMR:

  • Real-time, high-quality retargeting that unlocks real-time whole-body teleoperation, i.e., TWIST.
  • Carefully tuned for good performance of RL tracking policies.
  • Supports multiple humanoid robots and multiple human motion data formats (see the table below).

Note

If you want this repo to support a new robot or a new human motion data format, send the robot files (.xml, .urdf, and meshes) or human motion data to Yanjie Ze, or create an issue; we will add support as soon as possible. Please make sure the robot files you send can be open-sourced in this repo.

This repo is licensed under the MIT License.

Since its release, GMR has been widely used by the community. See below for papers that use GMR:

  • arXiv 2025.08, HITTER: A HumanoId Table TEnnis Robot via Hierarchical Planning and Learning
  • arXiv 2025.08, Switch4EAI: Leveraging Console Game Platform for Benchmarking Robotic Athletics
  • arXiv 2025.05, TWIST: Teleoperated Whole-Body Imitation System

News & Updates

  • 2025-10-15: Now supporting PAL Robotics' Talos, the 15th humanoid robot.
  • 2025-10-14: GMR now supports Nokov BVH data.
  • 2025-10-14: Added a doc on IK config; see DOC.md.
  • 2025-10-09: Check the open-sourced TWIST code for RL motion tracking.
  • 2025-10-02: Tech report for GMR is now on arXiv.
  • 2025-10-01: GMR now supports converting GMR pickle files to CSV (for beyondmimic), check scripts/batch_gmr_pkl_to_csv.py.
  • 2025-09-25: An introduction on GMR is available on Bilibili.
  • 2025-09-16: GMR now supports using GVHMR to extract human pose from monocular video and retarget it to a robot.
  • 2025-09-12: GMR now supports Tienkung, the 14th humanoid robot in the repo.
  • 2025-08-30: GMR now supports Unitree H1 2 and PND Adam Lite, the 12th and 13th humanoid robots in the repo.
  • 2025-08-28: GMR now supports Booster T1 for both 23dof and 29dof.
  • 2025-08-28: GMR now supports using exported offline FBX motion data from OptiTrack.
  • 2025-08-27: GMR now supports Berkeley Humanoid Lite, the 11th humanoid robot in the repo.
  • 2025-08-24: GMR now supports Unitree H1, the 10th humanoid robot in the repo.
  • 2025-08-24: GMR now supports velocity limits for the robot motors: use_velocity_limit=True by default in the GeneralMotionRetargeting class, with 3*pi as the default limit. We also print robot DoF/body/motor names and their IDs by default; you can access them via the robot_dof_names, robot_body_names, and robot_motor_names attributes.
  • 2025-08-10: GMR now supports Booster K1, the 9th robot in the repo.
  • 2025-08-09: GMR now supports Unitree G1 with Dex31 hands.
  • 2025-08-07: GMR now supports Galaxea R1 Pro (this is a wheeled humanoid robot!) and KUAVO, the 7th and 8th humanoid robots in the repo.
  • 2025-08-06: GMR now supports HighTorque Hi, the 6th humanoid robot in the repo.
  • 2025-08-04: Initial release of GMR. Check our twitter post.

Demos

Demo 1
Retargeting LAFAN1 dancing motion to 5 robots.
GMR.mp4
Demo 2
Galaxea R1 Pro robot (view 1).
galaxea_r1pro_KIT_3_walk_6m_straight_line04_stageii.mp4
Demo 3
Galaxea R1 Pro robot (view 2).
galaxea_r1pro_Transitions_mazen_c3d_twistdance_jumpingtwist360_stageii.mp4
Demo 4
Switching robots by changing one argument.
GMR_screen_record.mp4
Demo 5
HighTorque robot doing a twist dance.
hightorque_hi_Transitions_mazen_c3d_twistdance_jumpingtwist360_stageii.mp4
Demo 6
Kuavo robot picking up a box.
kuavo_s45_ACCAD_Female1General_c3d_A5_-_pick_up_box_stageii.mp4
Demo 7
Unitree H1 robot doing a ChaCha dance.
unitree_h1_KIT_572_dance_chacha11_stageii.mp4
Demo 8
Booster T1 robot jumping (view 1).
booster_t1_29dof_01_01_stageii.mp4
Demo 9
Booster T1 robot jumping (view 2).
booster_t1_01_01_stageii.mp4
Demo 10
Unitree H1-2 robot jumping.
unitree_h1_2_01_01_stageii.mp4
Demo 11
PND Adam Lite robot.
pnd_adam_lite_ACCAD_MartialArtsWalksTurns_c3d_E5_-_retreat_stageii.mp4
Demo 12
Tienkung robot walking.
2025-09-10.15-11-25.mp4
Demo 13
Extracting human pose (GVHMR + GMR).
▶ Watch on Bilibili
Demo 14
PAL Robotics’ Talos robot fighting.
talos_fight_shortened.mp4

Supported Robots and Data Formats

GMR currently supports four human motion data formats: SMPL-X (AMASS, OMOMO), BVH (LAFAN1), FBX (OptiTrack), and BVH (Nokov). The robots below are supported; entries marked TBD are still pending.

| ID | Robot | Identifier | Robot DoF |
|----|-------|------------|-----------|
| 0 | Unitree G1 | unitree_g1 | Leg (2*6) + Waist (3) + Arm (2*7) = 29 |
| 1 | Unitree G1 with Hands | unitree_g1_with_hands | Leg (2*6) + Waist (3) + Arm (2*7) + Hand (2*7) = 43 |
| 2 | Unitree H1 | unitree_h1 | Leg (2*5) + Waist (1) + Arm (2*4) = 19 |
| 3 | Unitree H1 2 | unitree_h1_2 | Leg (2*6) + Waist (1) + Arm (2*7) = 27 |
| 4 | Booster T1 | booster_t1 | TBD |
| 5 | Booster T1 (29 DoF) | booster_t1_29dof | TBD |
| 6 | Booster K1 | booster_k1 | Neck (2) + Arm (2*4) + Leg (2*6) = 22 |
| 7 | Stanford ToddlerBot | stanford_toddy | TBD |
| 8 | Fourier N1 | fourier_n1 | TBD |
| 9 | ENGINEAI PM01 | engineai_pm01 | TBD |
| 10 | HighTorque Hi | hightorque_hi | Head (2) + Arm (2*5) + Waist (1) + Leg (2*6) = 25 |
| 11 | Galaxea R1 Pro (wheeled robot!) | galaxea_r1pro | Base (6) + Torso (4) + Arm (2*7) = 24 |
| 12 | Kuavo | kuavo_s45 | Head (2) + Arm (2*7) + Leg (2*6) = 28 |
| 13 | Berkeley Humanoid Lite (needs further tuning) | berkeley_humanoid_lite | Leg (2*6) + Arm (2*5) = 22 |
| 14 | PND Adam Lite | pnd_adam_lite | Leg (2*6) + Waist (3) + Arm (2*5) = 25 |
| 15 | Tienkung | tienkung | Leg (2*6) + Arm (2*4) = 20 |
| 16 | PAL Robotics' Talos | pal_talos | Head (2) + Arm (2*7) + Waist (2) + Leg (2*6) = 30 |

More robots coming soon: AgiBot A2 (agibot_a2) and OpenLoong (openloong).

Installation

Note

The code is tested on Ubuntu 22.04/20.04.

First create your conda environment:

```bash
conda create -n gmr python=3.10 -y
conda activate gmr
```

Then, install GMR:

```bash
pip install -e .
```

After installing SMPL-X, change `ext` in `smplx/body_models.py` from `npz` to `pkl` if you are using SMPL-X `.pkl` files.

To resolve possible rendering issues:

```bash
conda install -c conda-forge libstdcxx-ng -y
```

Data Preparation

[SMPL-X body model] Download the SMPL-X body models from SMPL-X to assets/body_models and structure them as follows:

```
assets/body_models/smplx/
├── SMPLX_NEUTRAL.pkl
├── SMPLX_FEMALE.pkl
└── SMPLX_MALE.pkl
```

[AMASS motion data] Download raw SMPL-X data from AMASS to any folder you want. NOTE: do not download SMPL+H data.

[OMOMO motion data] Download raw OMOMO data to any folder you want from this Google Drive file, then process it into SMPL-X format using scripts/convert_omomo_to_smplx.py.

[LAFAN1 motion data] Download the raw LAFAN1 BVH files from the official repo, i.e., lafan1.zip.

Human/Robot Motion Data Formulation

To get the most out of this library, it helps to first understand the human motion data we consume and the robot motion data we produce.

Each frame of human motion data is formulated as a dict mapping human_body_name to its 3D global translation and global rotation.

Each frame of robot motion data can be understood as a tuple of (robot_base_translation, robot_base_rotation, robot_joint_positions).
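Concretely, the two representations can be modeled as below. This is a minimal sketch: the exact key names, quaternion convention, and array layouts used inside GMR are assumptions here, not its actual API.

```python
import numpy as np

# One frame of human motion: body name -> global pose.
# (translation as xyz, rotation as a wxyz quaternion; the layout is an assumption)
human_frame = {
    "pelvis": (np.zeros(3), np.array([1.0, 0.0, 0.0, 0.0])),
    "left_hand": (np.array([0.3, 0.2, 1.0]), np.array([1.0, 0.0, 0.0, 0.0])),
}

# One frame of robot motion: base translation, base rotation, joint positions.
robot_frame = (
    np.zeros(3),                      # base translation (x, y, z)
    np.array([1.0, 0.0, 0.0, 0.0]),  # base rotation (quaternion)
    np.zeros(29),                     # joint positions, e.g. 29 DoF for Unitree G1
)
```

Retargeting maps a stream of `human_frame` dicts to a stream of `robot_frame` tuples, one per time step.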

Usage

Retargeting from SMPL-X (AMASS, OMOMO) to Robot

Note

NOTE: after installing SMPL-X, change `ext` in `smplx/body_models.py` from `npz` to `pkl` if you are using SMPL-X `.pkl` files.

Retarget a single motion:

```bash
python scripts/smplx_to_robot.py --smplx_file <path_to_smplx_data> --robot <path_to_robot_data> --save_path <path_to_save_robot_data.pkl> --rate_limit
```

By default you should see the retargeted robot motion visualized in a MuJoCo window. To record a video, add --record_video and --video_path <your_video_path.mp4>.

  • --rate_limit limits playback to the frame rate of the human motion. Remove it to retarget as fast as possible.

Retarget a folder of motions:

```bash
python scripts/smplx_to_robot_dataset.py --src_folder <path_to_dir_of_smplx_data> --tgt_folder <path_to_dir_to_save_robot_data> --robot <robot_name>
```

By default there is no visualization for batch retargeting.
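If you prefer driving the single-motion script yourself, e.g. to parallelize or filter files, a hypothetical wrapper could build one command per motion file. The `.npz` glob pattern and the output naming are assumptions; the script path and flags are taken from the single-motion command above.

```python
from pathlib import Path

def build_commands(src_folder, tgt_folder, robot):
    """Build one smplx_to_robot.py command per .npz motion in src_folder."""
    src, tgt = Path(src_folder), Path(tgt_folder)
    cmds = []
    for motion in sorted(src.glob("*.npz")):
        out = tgt / (motion.stem + ".pkl")  # mirror the source file name
        cmds.append([
            "python", "scripts/smplx_to_robot.py",
            "--smplx_file", str(motion),
            "--robot", robot,
            "--save_path", str(out),
        ])
    return cmds
```

Each command list can then be handed to `subprocess.run` (sequentially, or via a process pool for parallel retargeting).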

Retargeting from GVHMR to Robot

First, install GVHMR by following their official instructions.

Then run their demo, which extracts human pose from monocular video:

```bash
cd path/to/GVHMR
python tools/demo/demo.py --video=docs/example_video/tennis.mp4 -s
```

You should then obtain the saved human pose data at GVHMR/outputs/demo/tennis/hmr4d_results.pt.

Then, run the command below to retarget the extracted human pose data to your robot:

```bash
python scripts/gvhmr_to_robot.py --gvhmr_pred_file <path_to_hmr4d_results.pt> --robot unitree_g1 --record_video
```

Retargeting from BVH (LAFAN1, Nokov) to Robot

Retarget a single motion:

```bash
# single motion
python scripts/bvh_to_robot.py --bvh_file <path_to_bvh_data> --robot <path_to_robot_data> --save_path <path_to_save_robot_data.pkl> --rate_limit --format <format>
```

By default you should see the visualization of the retargeted robot motion in a mujoco window.

  • --rate_limit limits playback to the frame rate of the human motion. Remove it to retarget as fast as possible.
  • --format specifies the format of the BVH data. Supported formats are lafan1 and nokov.
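The effect of --rate_limit can be pictured as a simple per-frame sleep that pins playback to the source frame rate. This is a sketch of the idea, not GMR's actual implementation:

```python
import time

def play_at_rate(frames, fps, process_frame):
    """Process frames no faster than fps by sleeping off leftover frame time."""
    dt = 1.0 / fps
    for frame in frames:
        start = time.perf_counter()
        process_frame(frame)  # e.g. retarget one frame and render it
        leftover = dt - (time.perf_counter() - start)
        if leftover > 0:      # only sleep if processing finished early
            time.sleep(leftover)
```

Without the sleep, a fast CPU would replay a 30 FPS motion clip far faster than real time, which is fine for batch conversion but wrong for visualization or teleoperation.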

Retarget a folder of motions:

```bash
python scripts/bvh_to_robot_dataset.py --src_folder <path_to_dir_of_bvh_data> --tgt_folder <path_to_dir_to_save_robot_data> --robot <robot_name>
```

By default there is no visualization for batch retargeting.

Retargeting from FBX (OptiTrack) to Robot

Offline FBX Files

Retarget a single motion:

  1. Install fbx_sdk by following the linked installation instructions. You will probably need a separate conda environment for this.

  2. Activate the conda environment where you installed fbx_sdk. Use the following command to extract motion data from your .fbx file:

```bash
cd third_party
python poselib/fbx_importer.py --input <path_to_fbx_file.fbx> --output <path_to_save_motion_data.pkl> --root-joint <root_joint_name> --fps <fps>
```
  3. Then, run the command below to retarget the extracted motion data to your robot:

```bash
conda activate gmr
# single motion
python scripts/fbx_offline_to_robot.py --motion_file <path_to_saved_motion_data.pkl> --robot <path_to_robot_data> --save_path <path_to_save_robot_data.pkl> --rate_limit
```

By default you should see the visualization of the retargeted robot motion in a mujoco window.

  • --rate_limit limits playback to the frame rate of the human motion. Remove it to retarget as fast as possible.

Online Streaming

We provide a script that streams OptiTrack MoCap data for real-time retargeting.

Typically you will have two computers: a server with Motive (the OptiTrack desktop app) installed, and a client with GMR installed.

Find the server IP (the computer running Motive) and the client IP (your computer), then configure streaming as follows:

OptiTrack Streaming

And then run:

```bash
python scripts/optitrack_to_robot.py --server_ip <server_ip> --client_ip <client_ip> --use_multicast False --robot unitree_g1
```

You should see the visualization of the retargeted robot motion in a mujoco window.

Visualize saved robot motion

Visualize a single motion:

```bash
python scripts/vis_robot_motion.py --robot <robot_name> --robot_motion_path <path_to_save_robot_data.pkl>
```

To record a video, add --record_video and --video_path <your_video_path.mp4>.

Visualize a folder of motions:

```bash
python scripts/vis_robot_motion_dataset.py --robot <robot_name> --robot_motion_folder <path_to_save_robot_data_folder>
```

After launching the MuJoCo visualization window and clicking on it, you can use the following keyboard controls:

  • [: play the previous motion
  • ]: play the next motion
  • space: toggle play/pause
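The keyboard logic amounts to a small playlist state machine that a viewer key callback can drive. The class below is a hypothetical sketch of that state, not GMR's actual implementation:

```python
class MotionPlaylist:
    """Track the current motion index and play/pause state for a motion folder."""

    def __init__(self, n_motions):
        self.n = n_motions
        self.index = 0       # which motion in the folder is playing
        self.playing = True  # whether playback is running

    def on_key(self, key):
        if key == "[":                            # previous motion (wraps around)
            self.index = (self.index - 1) % self.n
        elif key == "]":                          # next motion (wraps around)
            self.index = (self.index + 1) % self.n
        elif key == " ":                          # toggle play/pause
            self.playing = not self.playing
```

A MuJoCo viewer's key callback would translate raw key codes into these characters and call `on_key`; the render loop then reads `index` and `playing` each frame.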

Speed Benchmark

| CPU | Retargeting speed |
|-----|-------------------|
| AMD Ryzen Threadripper 7960X (24 cores) | 60–70 FPS |
| 13th Gen Intel Core i9-13900K (24 cores) | 35–45 FPS |
| TBD | TBD |
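Numbers like these can be reproduced by timing the per-frame retargeting call. The harness below is generic; `retarget_frame` is a stand-in for whatever per-frame function you benchmark, not a GMR API:

```python
import time

def measure_fps(retarget_frame, frames, warmup=10):
    """Average retargeting FPS over frames, after a short warmup."""
    for frame in frames[:warmup]:   # warm caches / JIT before timing
        retarget_frame(frame)
    start = time.perf_counter()
    for frame in frames[warmup:]:
        retarget_frame(frame)
    elapsed = time.perf_counter() - start
    return (len(frames) - warmup) / elapsed
```

Benchmark on the same motion clip across machines to keep the comparison fair, since per-frame IK cost depends on the motion.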

Citation

If you find our code useful, please consider citing our related papers:

```bibtex
@article{joao2025gmr,
  title={Retargeting Matters: General Motion Retargeting for Humanoid Motion Tracking},
  author={Joao Pedro Araujo and Yanjie Ze and Pei Xu and Jiajun Wu and C. Karen Liu},
  year={2025},
  journal={arXiv preprint arXiv:2510.02252}
}

@article{ze2025twist,
  title={TWIST: Teleoperated Whole-Body Imitation System},
  author={Yanjie Ze and Zixuan Chen and João Pedro Araújo and Zi-ang Cao and Xue Bin Peng and Jiajun Wu and C. Karen Liu},
  year={2025},
  journal={arXiv preprint arXiv:2505.02833}
}
```

and this GitHub repo:

```bibtex
@software{ze2025gmr,
  title={GMR: General Motion Retargeting},
  author={Yanjie Ze and João Pedro Araújo and Jiajun Wu and C. Karen Liu},
  year={2025},
  url={https://github.com/YanjieZe/GMR},
  note={GitHub repository}
}
```

Known Issues

Designing a single retargeting config that works for all humans is not trivial, and some motions may retarget poorly. If you observe bad results, please let us know! We keep a collection of such motions in TEST_MOTIONS.md.

Acknowledgement

Our IK solver is built upon mink and MuJoCo, and our visualization is built upon MuJoCo. The human motion data we use includes AMASS, OMOMO, and LAFAN1.

The original robot models can be found at their respective official sources.
