CVPR 2020 Tutorial on
Novel View Synthesis: From Depth-Based Warping to Multi-Plane Images and Beyond
Novel view synthesis is a long-standing problem at the intersection of computer graphics and computer vision.
Seminal work in this field dates back to the 1990s, with early methods interpolating either between corresponding pixels of the input images or between rays in space.
Recent deep learning methods have enabled tremendous improvements in the quality of the results and brought renewed popularity to the field.
The teaser above shows novel view synthesis from different recent methods. From left to right: Yoon et al. [1], Mildenhall et al. [2], Wiles et al. [3], and Choi et al. [4]. Images and videos courtesy of the respective authors.
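For readers new to the area, the sketch below illustrates the depth-based warping idea named in the tutorial title: each source pixel is back-projected using its depth and reprojected into a hypothetical novel view. This is a minimal illustration only, not code from any of the speakers; the camera parameters `K`, `R`, `t` and the nearest-pixel splatting are simplifying assumptions.

```python
# Minimal sketch of depth-based warping (illustrative only).
# Assumptions: shared intrinsics K for both views, known relative pose (R, t),
# nearest-pixel splatting with no z-buffering or hole filling.
import numpy as np

def warp_to_novel_view(image, depth, K, R, t):
    """Forward-warp `image` (H, W, 3) with per-pixel `depth` (H, W) into the
    view defined by rotation R (3, 3) and translation t (3,)."""
    H, W = depth.shape
    # Pixel grid in homogeneous coordinates, shape (3, H*W).
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T

    # Back-project to 3D points in the source camera frame.
    pts_src = np.linalg.inv(K) @ pix * depth.reshape(1, -1)

    # Transform into the target camera frame and project with K.
    pts_tgt = R @ pts_src + t[:, None]
    proj = K @ pts_tgt
    uv = (proj[:2] / np.clip(proj[2:], 1e-6, None)).round().astype(int)

    # Splat source colors into the novel view where the projection lands in-bounds.
    novel = np.zeros_like(image)
    valid = (uv[0] >= 0) & (uv[0] < W) & (uv[1] >= 0) & (uv[1] < H) & (pts_tgt[2] > 0)
    novel[uv[1, valid], uv[0, valid]] = image.reshape(-1, 3)[valid]
    return novel
```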
We would like to thank our speakers again for their excellent talks, which made this tutorial a great success.
You can also click on the links in the table below to go to specific talks.
We will share the slides from the talks soon.
Goal of the Tutorial
In this tutorial we will first introduce the problem, offering context and a taxonomy of the different methods. We will then have talks by the researchers behind the most recent approaches in the field. At the end of the tutorial, we will hold a roundtable discussion with all the speakers.
Date and Location
The tutorial took place on June 14th, 2020 as part of CVPR 2020.
Contact us here.
Organizers
Invited Speakers
Rick Szeliski, Facebook
Pratul Srinivasan, UC Berkeley
Richard Tucker, Google
Olivia Wiles, U. of Oxford
Program with Links to the Videos of the Talks
Slides available here.
| Time | Talk Title | Speaker |
|---|---|---|
| 9:20 - 9:50 | Novel View Synthesis: A Gentle Introduction [Video] | Orazio |
| 9:50 - 10:20 | Reflections on Image-Based Rendering [Video] | Rick |
| 10:20 - 10:50 | SynSin: Single Image View Synthesis [Video] | Olivia |
| 10:50 - 11:00 | Coffee break (15m) | |
| 11:00 - 11:30 | View synthesis with Multiplane Images [Video] | Richard |
| 11:30 - 12:00 | View Synthesis and Immersive Mixed Reality for VR devices [Video] | Gaurav |
| 12:00 - 12:45 | Lunch break (45m) | |
| 12:45 - 13:15 | View and Frame Interpolation for Consumer Light Field Cameras [Video] | Nima |
| 13:15 - 13:45 | NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis [Video] | Pratul |
| 13:45 - 14:15 | Novel View Synthesis from Dynamic Scenes [Video] | Jae Shin |
| 14:15 - 14:30 | Coffee break (15m) | |
| 14:30 - 15:30 | Round Table Discussion With the Invited Speakers [Video] | |