NAS-NeRF: Generative Neural Architecture Search for Neural Radiance Fields

Published 25 Sep 2023 in cs.CV, cs.AI, and cs.LG (arXiv:2309.14293v3)

Abstract: Neural radiance fields (NeRFs) enable high-quality novel view synthesis, but their high computational complexity limits deployability. While existing neural-based solutions strive for efficiency, they use one-size-fits-all architectures regardless of scene complexity. The same architecture may be unnecessarily large for simple scenes but insufficient for complex ones. Thus, there is a need to dynamically optimize the neural network component of NeRFs to achieve a balance between computational complexity and specific targets for synthesis quality. We introduce NAS-NeRF, a generative neural architecture search strategy that generates compact, scene-specialized NeRF architectures by balancing architecture complexity and target synthesis quality metrics. Our method incorporates constraints on target metrics and budgets to guide the search towards architectures tailored for each scene. Experiments on the Blender synthetic dataset show the proposed NAS-NeRF can generate architectures up to 5.74$\times$ smaller, with 4.19$\times$ fewer FLOPs, and 1.93$\times$ faster on a GPU than baseline NeRFs, without suffering a drop in SSIM. Furthermore, we illustrate that NAS-NeRF can also achieve architectures up to 23$\times$ smaller, with 22$\times$ fewer FLOPs, and 4.7$\times$ faster than baseline NeRFs with only a 5.3% average SSIM drop. Our source code is also made publicly available at https://saeejithnair.github.io/NAS-NeRF.
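The abstract describes a search that balances architecture complexity against target synthesis-quality metrics under explicit budget constraints. As a rough illustration of that idea (not the paper's generative synthesis method), the sketch below runs a toy random search over hypothetical NeRF MLP depths and widths, keeping only candidates that fit a FLOP budget and meet an SSIM target; the quality proxy `proxy` and all parameter ranges are invented for the demo.

```python
import random

def mlp_flops_per_sample(depth, width, in_dim=63, out_dim=4):
    """Rough FLOP count for one forward pass through a NeRF-style MLP
    (multiply-accumulate counted as 2 FLOPs; skip connections ignored)."""
    dims = [in_dim] + [width] * depth + [out_dim]
    return sum(2 * a * b for a, b in zip(dims, dims[1:]))

def search(flop_budget, target_ssim, score_fn, trials=200, seed=0):
    """Toy budget-constrained architecture search (random search stand-in
    for the paper's generative strategy): among candidate (depth, width)
    pairs that fit the FLOP budget and meet the quality target, return
    the cheapest one by FLOPs as (flops, depth, width)."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        depth = rng.randint(2, 8)
        width = rng.choice([32, 64, 128, 256])
        flops = mlp_flops_per_sample(depth, width)
        if flops > flop_budget:
            continue  # violates the compute budget
        if score_fn(depth, width) < target_ssim:
            continue  # fails the synthesis-quality target
        if best is None or flops < best[0]:
            best = (flops, depth, width)
    return best

# Hypothetical quality proxy for the demo: bigger networks score higher.
# In the paper this role is played by actual per-scene SSIM measurements.
proxy = lambda d, w: 0.80 + 0.01 * d + 0.0004 * w

result = search(flop_budget=500_000, target_ssim=0.90, score_fn=proxy)
```

Per the abstract, the real method is scene-specialized: the same budget and target can yield different architectures for simple versus complex scenes, because the quality score is measured on the scene at hand rather than by a fixed proxy like the one above.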
