
# 🚃 TRAM

Official implementation for the paper:
TRAM: Global Trajectory and Motion of 3D Humans from in-the-wild Videos
Yufu Wang, Ziyun Wang, Lingjie Liu, Kostas Daniilidis
[Project Page]

## Installation

1. Clone this repo with the `--recursive` flag.

   ```bash
   git clone --recursive https://github.com/xiaohu-art/tram.git
   ```
2. Create a new anaconda environment.

   ```bash
   conda create -n tram python=3.10 -y
   conda activate tram
   ```
3. Install the corresponding dependencies. A quick sanity check for the finished environment is sketched after this list.

   ```bash
   conda install nvidia/label/cuda-11.8.0::cuda-toolkit cuda-nvcc=11.8 -y
   # Check "nvcc --version" and "which nvcc" to make sure you are using the nvcc from your anaconda environment.
   conda install pytorch==2.0.0 torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia -y

   pip install iopath  # iopath 0.1.8
   pip install 'git+https://github.com/facebookresearch/detectron2.git@a59f05630a8f205756064244bf5beb8661f96180'

   conda install -c fvcore -c iopath -c conda-forge fvcore iopath
   conda install -c bottler nvidiacub
   conda install pytorch3d -c pytorch3d

   conda install pytorch-scatter -c pyg
   conda install -c conda-forge suitesparse

   bash install.sh
   ```
4. Compile DROID-SLAM. If you encounter difficulties in this step, please refer to its official release for more information. In this project, DROID is modified to support masking.

   ```bash
   cd thirdparty/DROID-SLAM
   python setup.py install
   cd ../..
   ```

I added `-DTHRUST_IGNORE_CUB_VERSION_CHECK` in DROID-SLAM/setup.py to skip the version check between CUB and CUDA; a sketch of where the flag goes is shown below.
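
For reference, a minimal sketch of where that flag sits; the real DROID-SLAM/setup.py lists more sources and flags, and the extension name and source paths below are illustrative placeholders:

```python
# Sketch of the relevant portion of DROID-SLAM/setup.py. The only change
# is the extra -DTHRUST_IGNORE_CUB_VERSION_CHECK entry in the nvcc flags;
# source paths here are placeholders for the actual file list.
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

setup(
    name="droid_backends",
    ext_modules=[
        CUDAExtension(
            "droid_backends",
            sources=["src/droid.cpp", "src/droid_kernels.cu"],  # placeholders
            extra_compile_args={
                "cxx": ["-O3"],
                "nvcc": [
                    "-O3",
                    "-DTHRUST_IGNORE_CUB_VERSION_CHECK",  # skip the CUB/CUDA version check
                ],
            },
        )
    ],
    cmdclass={"build_ext": BuildExtension},
)
```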
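
With everything installed, a quick sanity check that the core stack imports and sees the GPU (a minimal sketch; the droid_backends module name assumes DROID-SLAM's default build):

```python
# Verify the environment: PyTorch with CUDA 11.8, pytorch3d, and the
# compiled DROID-SLAM extension should all import inside the tram env.
import torch
import pytorch3d
import droid_backends  # built by the DROID-SLAM step above

print("torch:", torch.__version__, "cuda:", torch.version.cuda)
print("cuda available:", torch.cuda.is_available())
print("pytorch3d:", pytorch3d.__version__)
```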

## Prepare data

Register at SMPLify and SMPL; your usernames and passwords will be used by our script to download the SMPL models. In addition, we will fetch trained checkpoints and an example video. Note that third-party models have their own licenses.

Run the following to fetch all models and checkpoints into data/:

```bash
bash scripts/download_models.sh
```

## Run demo on videos

This project integrates the complete 4D human system, including tracking, SLAM, and 4D human capture in world space. We separate the core functionalities into different scripts, which should be run sequentially; each step saves its result to be used by the next. All results will be saved in a folder with the same name as the video.

```bash
# 1. Run Masked DROID-SLAM (this step also detects and tracks humans)
python scripts/estimate_camera.py --video "./example_video.mov"
# -- You can indicate that the camera is static; the algorithm will also try to detect this on its own.
python scripts/estimate_camera.py --video "./another_video.mov" --static_camera

# 2. Run 4D human capture with VIMO.
python scripts/estimate_humans.py --video "./example_video.mov"

# 3. Put everything together and render the output video.
python scripts/visualize_tram.py --video "./example_video.mov"
```

Running the above three scripts on the provided video ./example_video.mov will create a folder ./results/example_video and save all results in it. Please see the scripts for the available arguments.

You can also directly run the bash script run.sh after modifying it with your own video path; a Python equivalent of the three steps is sketched below.
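
If you prefer to drive the pipeline from a script of your own, a minimal sketch that chains the three stages with their default arguments (only the video path is assumed to change):

```python
# Run the three stages in order, mirroring the commands above. Assumes this
# is launched from the repo root with the tram environment active.
import subprocess

VIDEO = "./example_video.mov"  # replace with your own video path

for script in (
    "scripts/estimate_camera.py",   # 1. masked SLAM + human detection/tracking
    "scripts/estimate_humans.py",   # 2. 4D human capture with VIMO
    "scripts/visualize_tram.py",    # 3. render the output video
):
    subprocess.run(["python", script, "--video", VIDEO], check=True)
```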

## Get joint positions and root orientation

While rendering the output video, I also save the joint positions and root orientation used by the downstream retargeting pipeline.

I record my videos at FPS=30 and interpolate the values to TARGET_FPS=50; you can modify these to fit your needs. A resampling sketch follows.
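
For illustration, a minimal resampling sketch assuming the joints are stored as a (T, J, 3) array and the root orientation as per-frame quaternions; the actual file names and layout depend on what the visualization step saves:

```python
# Resample 30 FPS motion to 50 FPS: linear interpolation for joint positions,
# spherical interpolation (slerp) for the root orientation. File names and
# array shapes here are assumptions for illustration.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

FPS, TARGET_FPS = 30, 50

joints = np.load("joints.npy")        # (T, J, 3) joint positions (hypothetical file)
root_quat = np.load("root_quat.npy")  # (T, 4) root quaternions, (x, y, z, w) order

T = joints.shape[0]
t_src = np.arange(T) / FPS
t_dst = np.arange(int(t_src[-1] * TARGET_FPS) + 1) / TARGET_FPS
t_dst = np.minimum(t_dst, t_src[-1])  # guard against float round-off at the end

# Per-coordinate linear interpolation of the joint positions.
flat = joints.reshape(T, -1)
flat_50 = np.stack([np.interp(t_dst, t_src, flat[:, k]) for k in range(flat.shape[1])], axis=1)
joints_50 = flat_50.reshape(len(t_dst), *joints.shape[1:])

# Slerp keeps interpolated root orientations on the unit quaternion sphere.
slerp = Slerp(t_src, Rotation.from_quat(root_quat))
root_quat_50 = slerp(t_dst).as_quat()
```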

## Acknowledgements

We benefit greatly from the following open-source works, from which we adapted parts of our code.

In addition, the pipeline includes Detectron2, Segment-Anything, and DEVA-Track-Anything.

## Citation

```bibtex
@article{wang2024tram,
  title={TRAM: Global Trajectory and Motion of 3D Humans from in-the-wild Videos},
  author={Wang, Yufu and Wang, Ziyun and Liu, Lingjie and Daniilidis, Kostas},
  journal={arXiv preprint arXiv:2403.17346},
  year={2024}
}
```
