FastNeRF: High-Fidelity Neural Rendering at 200FPS
Microsoft
Stephan J. Garbin*, Marek Kowalski*, Matthew Johnson, Jamie Shotton, Julien Valentin
*denotes equal contribution
Abstract
Recent work on Neural Radiance Fields (NeRF) showed how neural networks can be used to encode complex 3D environments that can be rendered photorealistically from novel viewpoints. Rendering these images is very computationally demanding, and recent improvements are still a long way from enabling interactive rates, even on high-end hardware. Motivated by scenarios on mobile and mixed reality devices, we propose FastNeRF, the first NeRF-based system capable of rendering high-fidelity photorealistic images at 200Hz on a high-end consumer GPU. The core of our method is a graphics-inspired factorization that allows for (i) compactly caching a deep radiance map at each position in space, and (ii) efficiently querying that map using ray directions to estimate the pixel values in the rendered image. Extensive experiments show that the proposed method is 3000 times faster than the original NeRF algorithm and at least an order of magnitude faster than existing work on accelerating NeRF, while maintaining visual quality and extensibility.
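The factorization described in the abstract can be sketched in a few lines: the radiance at a position p and view direction d is approximated as an inner product of D position-dependent components u_i(p) and D direction-dependent weights beta_i(d), so each factor can be cached on its own grid and combined cheaply at render time. The sketch below is illustrative only, not the authors' implementation; the stand-in networks `f_pos` and `f_dir`, the component count `D`, and all numeric values are assumptions used to show the shape of the computation.

```python
import numpy as np

D = 8  # number of factorized components (hypothetical value)

def f_pos(p):
    """Stand-in for the position-dependent network: density sigma
    plus D RGB components u_1(p)..u_D(p) that can be cached per voxel."""
    rng = np.random.default_rng(abs(hash(tuple(np.round(p, 3)))) % (2**32))
    sigma = rng.random()
    u = rng.random((D, 3))  # D components, each an RGB triple
    return sigma, u

def f_dir(d):
    """Stand-in for the direction-dependent network: D scalar weights
    beta_1(d)..beta_D(d) that can be cached over ray directions."""
    rng = np.random.default_rng(abs(hash(tuple(np.round(d, 3)))) % (2**32))
    return rng.random(D)

def radiance(p, d):
    """Combine the cached factors: color(p, d) = sum_i beta_i(d) * u_i(p)."""
    sigma, u = f_pos(p)
    beta = f_dir(d)
    color = beta @ u  # (D,) @ (D, 3) -> RGB triple
    return sigma, color

sigma, rgb = radiance(np.array([0.1, 0.2, 0.3]), np.array([0.0, 0.0, 1.0]))
```

Because the expensive position factor never sees the view direction, it can be evaluated once per grid cell and reused for every ray, which is what makes cached rendering at interactive rates possible.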
Video results
All the videos below were generated using cached FastNeRF at framerates ranging from 180FPS to over 1000FPS. The training data came from the LLFF and 360 Synthetic datasets used in the NeRF paper.
Citation
@article{garbin2021fastnerf,
title={FastNeRF: High-Fidelity Neural Rendering at 200FPS},
author={Garbin, Stephan J and Kowalski, Marek and Johnson, Matthew and Shotton, Jamie and Valentin, Julien},
journal={arXiv preprint arXiv:2103.10380},
year={2021}
}
