Sai Haneesh Allu*, Jishnu Jaykumar P*, Ninad Khargonkar, Tyler Summers, Jian Yao, Yu Xiang
(* Equal Contribution)
Please cite this work if it helps your research:
@misc{2025hrt1,
title={HRT1: One-Shot Human-to-Robot Trajectory Transfer for Mobile Manipulation},
author={Sai Haneesh Allu* and Jishnu Jaykumar P* and Ninad Khargonkar and Tyler Summers and Jian Yao and Yu Xiang},
year={2025},
url={https://arxiv.org/abs/2510.21026},
}
Clone the repository recursively to include all submodules:
git clone --recursive https://github.com/IRVLUTD/HRT1 && cd HRT1
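If the repository was already cloned without the --recursive flag, the submodules can still be fetched afterwards with git's standard submodule command:

git submodule update --init --recursive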
# Create a conda environment from scratch
conda create -n hrt1 python=3.10 # Python 3.10 required for samv2 and hamer dependencies
conda activate hrt1
# Set your CUDA_HOME environment variable
export CUDA_HOME=/usr/local/cuda
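As an optional sanity check (generic, and assuming the CUDA toolkit is actually installed under that prefix), nvcc from the exported path should report the installed toolkit version:

"$CUDA_HOME/bin/nvcc" --version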
This codebase is built on top of the robokit and gto tools. Refer to the README of each of the utilities below to set up the pipeline; a rough layout sketch follows this list.
- Stage I:
  dc/ contains the HoloLens app for data capture.
- Stage II & III:
  vie/ contains the human demo data capture and video information extraction (vie) modules, as well as grasp transfer.
  - Note: This also contains the BundleSDF module used to run object pose estimation during execution.
- Stage IV:
  tto/ contains the instructions for simulation and real-world setup, along with the runtime scripts for trajectory tracking optimization and task execution.
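For orientation, the top-level layout looks roughly like this (directory roles as described above; exact contents may differ):

HRT1/
├── dc/    # Stage I: HoloLens data-capture app
├── vie/   # Stages II & III: video information extraction, grasp transfer, BundleSDF
└── tto/   # Stage IV: sim/real-world setup and trajectory tracking optimization scripts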
To get the latest changes from the submodules:
git submodule sync
git submodule update --remote --recursive
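To check which commit each submodule is now pinned to, git's standard status command lists them:

git submodule status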