Optional: Details on Generating the Ground-Truth CAMS Embedding
If you open a meta file, e.g. pliers_meta.torch, you will find that every manipulation sequence consists of two keys: data and cams. The key data holds the ground-truth data we copied from HOI4D, and the key cams stores our generated ground-truth CAMS Embedding.
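As a rough sketch, the layout described above looks like the following; the sequence name is a hypothetical placeholder, and the real file would be read with torch.load:

```python
# Hypothetical sketch of the meta-file layout described above. The actual
# file would be loaded with: meta = torch.load("data/meta/pliers_meta.torch")
meta = {
    "sequence_0": {   # one entry per manipulation sequence (name is a placeholder)
        "data": {},   # ground-truth data copied from HOI4D (contents omitted)
        "cams": {},   # the generated ground-truth CAMS Embedding (contents omitted)
    },
}

# Every sequence carries exactly these two keys
for seq in meta.values():
    assert set(seq.keys()) == {"data", "cams"}
```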
The following script is a demo of how we generate our CAMS Embedding from mocap data. You can modify it to generate a CAMS Embedding from other mocap data, provided that data lets you find the correct contact pairs by simply computing the SDF between the object and the hand; otherwise you may need some predefined policies to get the right contacts.
cd data/preparation
python -u gen_cams_meta_pliers.py
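The SDF-based contact test mentioned above can be sketched as a nearest-point distance threshold. This is a minimal stand-in, not the repo's actual implementation: the function name, point-cloud inputs, and the 5 mm threshold are assumptions, and point-to-point distance only approximates a true SDF query:

```python
import numpy as np

def find_contacts(hand_points, object_points, threshold=0.005):
    """Return a boolean mask over hand points lying within `threshold`
    metres of the sampled object surface (a crude SDF approximation)."""
    # Pairwise distances between every hand point and every object point: (H, O)
    d = np.linalg.norm(hand_points[:, None, :] - object_points[None, :, :], axis=-1)
    # Distance from each hand point to its nearest object sample
    nearest = d.min(axis=1)
    return nearest < threshold

# Hypothetical toy data: one fingertip touching the object, one far away
hand = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0]])
obj = np.array([[0.0, 0.0, 0.002], [0.01, 0.0, 0.0]])
mask = find_contacts(hand, obj)   # → array([ True, False])
```

For real meshes you would query a signed distance field of the object surface instead of a sampled point cloud, which also distinguishes penetration (negative SDF) from proximity.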
Training Demo: Pliers
After finishing data preparation, you can use the following command to start training.
sh experiments/pliers/train.sh [1] [2] [3]
# [1] = GPU IDs to use, e.g. 0,1
# [2] = number of GPUs to use, e.g. 2
# [3] = port
Synthesis and Evaluation Demo: Pliers
After training, outputs will be written to experiments/pliers/tmp. Use the following command to start synthesizing.
sh synthesizer/run.sh [1] [2]
# [1] = aforementioned output path, e.g. experiments/pliers/tmp/val/
# [2] = meta data path, e.g. data/meta/pliers_meta.torch
After synthesizing, the generation of new manipulations is finished; the results are in experiments/pliers/synth. You can also go a step further and run the evaluation metrics using the following command.
sh eval/run.sh [1] [2]
# [1] = final results path, e.g. experiments/pliers/synth
# [2] = name of the file saving the evaluation result, e.g. eval.txt
About
CAnonicalized Manipulation Spaces for Category-Level Functional Hand-Object Manipulation Synthesis