Download the data from the project webpage and place it under ./data/.
Data Structure
<DATA-DIR>
./annots // Natural-language annotations; each file consists of three sentences.
./motions // Raw motion data standardized to the SMPL format, similar to AMASS.
./motions_processed // Processed motion data: joint positions and 6D rotations for the 22-joint SMPL kinematic structure.
./split // Train/val/test split.
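For orientation, here is a minimal sketch of loading one annotation and one processed motion clip. The .txt/.npy suffixes, the one-sentence-per-line layout, and the nested glob are assumptions about the release format, not guaranteed by the repo.

```python
# Minimal inspection sketch -- file suffixes and layout are assumptions;
# adjust them to whatever the downloaded release actually contains.
from pathlib import Path
import numpy as np

data_dir = Path("./data")

# Annotations: three natural-language sentences per file (assumed one per line).
annot_file = next((data_dir / "annots").glob("*.txt"))
sentences = [s for s in annot_file.read_text().splitlines() if s.strip()]
print(annot_file.name, "->", len(sentences), "sentences")

# Processed motions: joint positions + 6D rotations over the 22 SMPL joints.
motion_file = next((data_dir / "motions_processed").glob("**/*.npy"))
motion = np.load(motion_file)
print(motion_file.name, "->", motion.shape)  # e.g. (num_frames, feature_dim)
```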
Training
Modify the config files ./configs/model.yaml, ./configs/datasets.yaml, and ./configs/train.yaml, and then run:
python tools/train.py --LPA
Or train TIMotion without LPA:
python tools/train.py --epoch 1500
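The training and evaluation scripts read their settings from the YAML files in ./configs/. If you prefer scripting an override instead of editing by hand, a rough sketch with PyYAML might look like the following; the keys shown (epoch, batch_size) are placeholders, not the repo's actual schema.

```python
# Hypothetical config override -- key names are placeholders; check
# ./configs/train.yaml for the real field names before using this.
import yaml

with open("./configs/train.yaml") as f:
    cfg = yaml.safe_load(f)

cfg["epoch"] = 1500      # placeholder key mirroring the --epoch flag above
cfg["batch_size"] = 32   # placeholder key

with open("./configs/train.yaml", "w") as f:
    yaml.safe_dump(cfg, f)
```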
Evaluation
Modify the config files ./configs/model.yaml and ./configs/datasets.yaml, and then run:
python tools/eval.py --LPA --pth ${CHECKPOINT}
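As an optional sanity check before evaluation, you can confirm the checkpoint file loads with standard PyTorch; the path below is a placeholder for ${CHECKPOINT}, and this sketch is not part of the repo's tooling.

```python
# Quick checkpoint sanity check (a sketch, not part of the repo's tooling).
import torch

ckpt_path = "path/to/checkpoint.pth"  # placeholder for ${CHECKPOINT}
state = torch.load(ckpt_path, map_location="cpu")
if isinstance(state, dict):
    print("top-level keys:", list(state.keys())[:10])
```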
Citation
If you find our work useful in your research, please consider citing:
@inproceedings{wang2025timotion,
title={TIMotion: Temporal and Interactive Framework for Efficient Human-Human Motion Generation},
author={Wang, Yabiao and Wang, Shuo and Zhang, Jiangning and Fan, Ke and Wu, Jiafu and Xue, Zhucun and Liu, Yong},
booktitle={Proceedings of the Computer Vision and Pattern Recognition Conference},
pages={7169--7178},
year={2025}
}