Tanner Schmidt
tanner.schmidt at oculus.com
I am currently a Research Scientist at Facebook Reality Labs Research (FRLR). I am most interested in research with robotics and computer vision applications. In particular I enjoy working at the intersection of 3D geometry and modern machine learning techniques, where one can leverage both large amounts of data and principles of geometry that have been known for millennia.
Before FRLR, I did my Ph.D. in Computer Science and Engineering at the University of Washington. At UW, I was a member of the Robotics and State Estimation Lab advised by Dieter Fox. Prior to that, I did my bachelor’s in both Computer Science and Electrical and Computer Engineering at Duke University.
Publications:
- Model-Based Self-Supervision for Fine-Grained Image Understanding
  T. Schmidt
  Ph.D. Dissertation

- Self-supervised Visual Descriptor Learning for Dense Correspondence
  T. Schmidt, R. A. Newcombe, D. Fox
  Robotics and Automation Letters
  (Best Robotic Vision Paper Award)

- DART: Dense Articulated Real-Time Tracking with Consumer Depth Cameras
  T. Schmidt, R. A. Newcombe, D. Fox
  Autonomous Robots, Volume 39, Issue 3 (2015)

- Depth-Based Tracking with Physical Constraints for Robot Manipulation
  T. Schmidt, K. Hertkorn, R. A. Newcombe, Z. Marton, M. Suppa, D. Fox
  IEEE International Conference on Robotics and Automation (ICRA) 2015
  (Finalist, Best Robotic Vision Paper Award)

- DART: Dense Articulated Real-Time Tracking
  T. Schmidt, R. A. Newcombe, D. Fox
  Robotics: Science and Systems (RSS) 2014