# Step 1: Download data from: https://graphics.tu-bs.de/people-snapshot
# Step 2: Preprocess using our script
python scripts/peoplesnapshot/preprocess_PeopleSnapshot.py --root <PATH_TO_PEOPLESNAPSHOT> --subject male-3-casual
# Step 3: Download SMPL from: https://smpl.is.tue.mpg.de/ and place the model in ./data/SMPLX/smpl/
# └── SMPLX/smpl/
#     ├── SMPL_FEMALE.pkl
#     ├── SMPL_MALE.pkl
#     └── SMPL_NEUTRAL.pkl
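As an optional sanity check (not part of the official pipeline), a minimal sketch like the following can confirm that the SMPL model files ended up in the location listed above before you start training:

```bash
# Optional sanity check: confirm the SMPL body models are where the
# steps above place them (./data/SMPLX/smpl/).
for f in SMPL_FEMALE.pkl SMPL_MALE.pkl SMPL_NEUTRAL.pkl; do
  if [ ! -f "./data/SMPLX/smpl/$f" ]; then
    echo "Missing ./data/SMPLX/smpl/$f -- download it from https://smpl.is.tue.mpg.de/"
  fi
done
```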
Quick Start
Quickly learn and animate an avatar with bash ./bash/run-demo.sh
Play with Your Own Video
Here we use the in-the-wild video provided by NeuMan as an example:
1. Create a yaml file specifying the details of the sequence in ./confs/dataset/. For this example it is provided in ./confs/dataset/neuman/seattle.yaml.
2. Download the data from NeuMan's repo, and run cp -r <path-to-neuman-dataset>/seattle/images ./data/custom/seattle/
3. Run the bash script bash scripts/custom/process-sequence.sh ./data/custom/seattle neutral to preprocess the images (a consolidated sketch of these commands follows below).
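Putting steps 2 and 3 together, a minimal sketch of the copy-and-preprocess commands might look like the following. Here <path-to-neuman-dataset> remains a placeholder for your local NeuMan download, and the mkdir -p line is an assumption added so the target directory exists before copying:

```bash
# Copy the NeuMan "seattle" images into the expected custom-data folder,
# then run the provided preprocessing script on them.
mkdir -p ./data/custom/seattle            # assumed helper step; create the target folder
cp -r <path-to-neuman-dataset>/seattle/images ./data/custom/seattle/
bash scripts/custom/process-sequence.sh ./data/custom/seattle neutral
```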
Acknowledgements
We are grateful to the developers and contributors of the repositories this project builds on for their hard work and dedication to the open-source community. Without their contributions, our project would not have been possible.
Citation
@inproceedings{jiang2022instantavatar,
  author    = {Jiang, Tianjian and Chen, Xu and Song, Jie and Hilliges, Otmar},
  title     = {InstantAvatar: Learning Avatars from Monocular Video in 60 Seconds},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2023},
}