TaskGrasp
Despite the enormous progress in robotic grasping in recent years, existing methods have yet to scale and generalize task-oriented grasping to the same extent. This is largely due to the limited scale of existing datasets, both in the number of objects and in the tasks studied. We address these concerns with the TaskGrasp dataset, which is more diverse in both objects and tasks, and an order of magnitude larger than previous datasets. The dataset contains 250K task-oriented grasps for 56 tasks and 191 objects, along with their RGB-D information. We take advantage of this new breadth and diversity in the data and present the GCNGrasp framework, which uses the semantic knowledge of objects and tasks encoded in a knowledge graph to generalize to new object instances, classes, and even new tasks. Our framework shows a significant improvement of around 12% on held-out settings compared to baselines that do not use semantics. We demonstrate that our dataset and model are applicable to the real world by executing task-oriented grasps with a real robot on unknown objects.
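To make the idea behind GCNGrasp concrete, the following is a minimal sketch (not the authors' released code) of scoring a candidate grasp for a task by propagating object and task semantics through a small graph convolutional network over a knowledge graph. All class names, dimensions, and the toy graph here are assumptions for illustration; the actual model operates on point-cloud grasp features and the full TaskGrasp knowledge graph.

import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    """One graph-convolution step: mix row-normalized neighbor features, then project."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # adj: (N, N) adjacency with self-loops; row-normalize before mixing.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.proj((adj / deg) @ x))

class GraspTaskScorer(nn.Module):
    """Scores a (grasp, object, task) triple using graph-refined node embeddings."""
    def __init__(self, node_dim=64, grasp_dim=128):
        super().__init__()
        self.gcn1 = SimpleGCNLayer(node_dim, node_dim)
        self.gcn2 = SimpleGCNLayer(node_dim, node_dim)
        # Fuse the grasp feature (e.g., from a point-cloud encoder) with the
        # graph embeddings of the object-class node and the task node.
        self.head = nn.Sequential(
            nn.Linear(grasp_dim + 2 * node_dim, 128), nn.ReLU(),
            nn.Linear(128, 1))

    def forward(self, grasp_feat, node_feats, adj, obj_idx, task_idx):
        h = self.gcn2(self.gcn1(node_feats, adj), adj)
        fused = torch.cat([grasp_feat, h[obj_idx], h[task_idx]], dim=-1)
        return self.head(fused)  # logit: is this grasp suitable for this task?

# Toy usage: 5 knowledge-graph nodes (object classes + tasks), one grasp candidate.
N, node_dim, grasp_dim = 5, 64, 128
adj = torch.eye(N)
adj[0, 3] = adj[3, 0] = 1.0  # hypothetical edge between an object class and a task
model = GraspTaskScorer(node_dim, grasp_dim)
logit = model(torch.randn(grasp_dim), torch.randn(N, node_dim), adj,
              obj_idx=0, task_idx=3)

The design choice mirrored here is that the grasp evaluator conditions on graph-refined embeddings rather than raw class labels, so semantically related tasks and object classes share information through their edges; this is what lets the approach generalize to held-out instances, classes, and tasks.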
@inproceedings{murali2020taskgrasp,
  title={Same Object, Different Grasps: Data and Semantic Knowledge for Task-Oriented Grasping},
  author={Murali, Adithyavairavan and Liu, Weiyu and Marino, Kenneth and Chernova, Sonia and Gupta, Abhinav},
  booktitle={Conference on Robot Learning},
  year={2020}
}