Please follow INSTALL.md to set up the environment.
## Demo
First, download our released weights for KGNv1 and/or KGNv2 and put them under the `./exp` folder. Then run the demo on real-world data via:
```bash
bash experiments/demo_kgnv{1|2}.sh
```
**NOTE:** the released KGNv1 weight is trained on single-object data, while the KGNv2 weight is trained on multi-object data.
You should see example results (from KGNv2):
## Train and evaluate
### Data generation
The datasets used in the papers can be downloaded from these links: single-object and multi-object. Download, extract, and put them in the `./data/` folder.
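For reference, a minimal sketch of one way to place the downloaded archives, assuming hypothetical archive names and gzipped tarballs (substitute the actual file names from the download links):

```bash
# Hypothetical archive names; use the actual file names from the download links.
mkdir -p data
tar -xzf single_object_data.tar.gz -C data/
tar -xzf multi_object_data.tar.gz  -C data/
```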
Alternatively, you can generate the data yourself. For single-object data generation:
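A sketch of what the invocation might look like, assuming the data-generation script follows the same `experiments/*.sh` pattern as the demo scripts (the script name below is hypothetical; check the `experiments/` folder for the actual one):

```bash
# Hypothetical script name; see the experiments/ folder for the real data-generation script.
bash experiments/gen_data_single.sh
```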
## Citation

Please consider citing our work if you find the code helpful:
```bibtex
@inproceedings{chen2022keypoint,
  title={Keypoint-GraspNet: Keypoint-based 6-DoF Grasp Generation from the Monocular RGB-D input},
  author={Chen, Yiye and Lin, Yunzhi and Xu, Ruinian and Vela, Patricio},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2023}
}
```

```bibtex
@article{chen2023kgnv2,
  title={KGNv2: Separating Scale and Pose Prediction for Keypoint-based 6-DoF Grasp Synthesis on RGB-D input},
  author={Chen, Yiye and Xu, Ruinian and Lin, Yunzhi and Chen, Hongyi and Vela, Patricio A},
  journal={IEEE International Conference on Intelligent Robots and Systems (IROS)},
  year={2023}
}
```