OakInk Dataset CVPR2022 | Hand-Object Interaction Dataset
After downloading all the files, you need to complete the Google Form to obtain the annotation file.
OakInk: A Large-scale Knowledge Repository for Understanding Hand-Object Interaction
¹Shanghai Jiao Tong University,
²Shanghai Qi Zhi Institute
* Equal contribution ✉ Corresponding author
CVPR 2022
Update
- Jun 19, 2024:   OakInk2 @ CVPR 2024 is released!
- Apr 04, 2024:   OakInk can be easily cloned from Hugging Face Datasets at OakInk-v1.
- Dec 11, 2023:   Grasp generation models on OakInk-Shape are released!
- Feb 11, 2023:   OakBase is released!
- Jan 03, 2023:   Hand mesh recovery models on OakInk-Image are released!
- Oct 18, 2022:   OakInk public v2.1 is released! This update fixes several artifacts, including wrong poses, time delays, and contact-surface mismatches. NOTE: if you downloaded the OakInk dataset before 11:00 AM October 18, 2022 (UTC), you only need to replace the previous anno.zip with the newly released anno_v2.1.zip (access via Google Forms), unzip it while keeping the same file structure as before, and install the latest OakInk Toolkit.
- Jul 26, 2022:   Tink has been made public.
- Jun 28, 2022:   OakInk public v2 and the OakInk Toolkit (a Python dataloader) are released!
- Mar 03, 2022:   OakInk was accepted by CVPR 2022.
About
OakInk contains three datasets:
- OakBase: Object Affordance Knowledge (Oak) base, including objects' part-level segmentation and attributes.
- OakInk-Image: a video dataset with 3D hand-object pose and shape annotations.
- OakInk-Shape: a 3D grasping pose dataset with hand and object mesh models.
Download
OakInk
Download at: Hugging Face. For researchers in China, you can also download OakInk from the alternative mirror: 百度云盘 (hrt9).
Arrange all the zip files into a directory, e.g. $OAKINK_DIR/zipped, as follows:
$OAKINK_DIR/zipped
├── OakBase.zip
├── image
│ ├── anno_v2.1.zip # access via Google Forms
│ ├── obj.zip
│ └── stream_zipped
│ ├── oakink_image_v2.z01
│ ├── ...
│ ├── oakink_image_v2.z10
│ └── oakink_image_v2.zip
└── shape
├── metaV2.zip
├── OakInkObjectsV2.zip
├── oakink_shape_v2.zip
└── OakInkVirtualObjectsV2.zip
and follow the instructions to verify checksums and unzip the files.
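The verify-and-unzip step could be sketched in Python as below. This is a minimal sketch under stated assumptions: the expected MD5 digests and the exact archive names must come from the official instructions, and the helper names (`md5sum`, `verify_and_unzip`) are illustrative, not part of the OakInk Toolkit.

```python
import hashlib
import zipfile
from pathlib import Path


def md5sum(path: Path, chunk: int = 1 << 20) -> str:
    """Stream a file through MD5 so multi-gigabyte zips do not fill RAM."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        while block := f.read(chunk):
            digest.update(block)
    return digest.hexdigest()


def verify_and_unzip(zipped_dir: Path, checksums: dict, out_dir: Path) -> list:
    """Check each archive against its expected digest, then extract it.

    `checksums` maps a path relative to `zipped_dir` to its expected MD5
    (hypothetical format; use whatever the official checksum list provides).
    Archives that match are extracted into `out_dir`; the relative paths of
    any archives that fail verification are returned instead of extracted.
    """
    failed = []
    for rel_path, expected_md5 in checksums.items():
        archive = zipped_dir / rel_path
        if md5sum(archive) != expected_md5:
            failed.append(rel_path)  # do not extract a corrupted download
            continue
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(out_dir)
    return failed
```

Note that this sketch only covers single-file archives; the stream-zipped image parts (oakink_image_v2.z01 through oakink_image_v2.zip) form a split zip that typically has to be merged (for example with `zip -s 0` or 7-Zip) before a standard extractor can read it.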
Resources
For details of the dataset annotations, please refer to Data documentation.
For details of the dataset splits for various tasks, please refer to Data splitting.
To load OakBase and visualize object parts and attributes, please refer to demo_oak_base.py.
To load OakInk-Image and OakInk-Shape for visualization, please refer to Load and visualize.
To train hand mesh recovery models on OakInk-Image, please refer to OakInk-HMR.
To train grasp generation models on OakInk-Shape, please refer to OakInk-Grasp-Generation.
BibTeX
@InProceedings{YangCVPR2022OakInk,
author = {Yang, Lixin and Li, Kailin and Zhan, Xinyu and Wu, Fei and Xu, Anran and Liu, Liu and Lu, Cewu},
title = {{OakInk}: A Large-Scale Knowledge Repository for Understanding Hand-Object Interaction},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2022},
}