GANeRFine: Universal NeRF Output Refinement via GANs for High-Quality Results

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

Abstract

Neural Radiance Fields (NeRFs) have transformed 3D scene reconstruction and rendering, yet they often struggle to capture fine details and realistic textures, especially when trained on sparse datasets. To address these limitations, we introduce GANeRFine, a novel framework that integrates Generative Adversarial Networks (GANs) with NeRF models to enhance output fidelity. GANs are renowned for producing highly realistic textures and intricate details, making them an ideal complement to NeRFs. GANeRFine leverages the strengths of both paradigms to refine outputs generated from both sparsely posed and densely posed images with known camera parameters. Our implementation spans four leading NeRF variants: NeRF, Mip-NeRF, Mip-NeRF 360, and ZipNeRF, demonstrating broad applicability across architectures. Such photorealism is vital for constructing convincing virtual environments and lifelike avatars, foundational elements of immersive Metaverse applications. By improving visual fidelity from minimal input, GANeRFine supports scalable, interactive scene creation for dynamic Metaverse worlds. Extensive evaluations across diverse datasets show that GANeRFine significantly enhances the visual quality of novel views, outperforming existing models in SSIM and PSNR metrics while achieving lower LPIPS and KID scores. Furthermore, GANeRFine enables near real-time inference, increasing the practicality of NeRF-based systems in latency-sensitive and resource-constrained environments. These advancements empower real-time 3D content generation for virtual worlds, augmented experiences, and live digital twins, positioning GANeRFine as a key engine for next-generation Metaverse infrastructure. This integration not only pushes the boundaries of NeRF capabilities but also sets a new standard for photorealistic 3D rendering.
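The abstract describes refining NeRF renders with an adversarial signal. A minimal sketch of that idea, in PyTorch, is shown below; the network sizes, residual refiner, patch discriminator, and loss weighting here are illustrative assumptions, not the paper's actual GANeRFine architecture.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of GAN-based refinement of a NeRF render:
# a small residual "refiner" (generator) maps a blurry render toward
# the ground-truth photo, while a patch discriminator supplies an
# adversarial realism signal. All details are illustrative assumptions.

class Refiner(nn.Module):
    def __init__(self, ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 3, 3, padding=1),
        )
    def forward(self, x):
        # Residual refinement: predict a correction on top of the render.
        return torch.sigmoid(self.net(x) + x)

class PatchDiscriminator(nn.Module):
    def __init__(self, ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(ch, 1, 4, stride=2, padding=1),  # per-patch logits
        )
    def forward(self, x):
        return self.net(x)

def refinement_step(refiner, disc, nerf_render, gt_image, opt_g, opt_d):
    bce = nn.BCEWithLogitsLoss()

    # Discriminator update: real photos vs. detached refined renders.
    fake = refiner(nerf_render).detach()
    d_real, d_fake = disc(gt_image), disc(fake)
    d_loss = bce(d_real, torch.ones_like(d_real)) + \
             bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: reconstruction loss plus adversarial realism term.
    refined = refiner(nerf_render)
    g_adv = disc(refined)
    g_loss = nn.functional.l1_loss(refined, gt_image) + \
             0.01 * bce(g_adv, torch.ones_like(g_adv))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return refined

refiner, disc = Refiner(), PatchDiscriminator()
opt_g = torch.optim.Adam(refiner.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
render = torch.rand(1, 3, 32, 32)  # stand-in for a blurry NeRF render
photo = torch.rand(1, 3, 32, 32)   # stand-in for the ground-truth view
out = refinement_step(refiner, disc, render, photo, opt_g, opt_d)
print(out.shape)
```

Because the refiner operates on rendered images rather than on the radiance field itself, a step like this can in principle be attached to any NeRF backbone, which matches the paper's claim of applicability across NeRF, Mip-NeRF, Mip-NeRF 360, and ZipNeRF.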

Original language: English (US)
Title of host publication: Proceedings - 2025 International Conference on Metaverse Computing, Networking and Applications, MetaCom 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 213-220
Number of pages: 8
ISBN (Electronic): 9798331522551
DOIs
State: Published - 2025
Externally published: Yes
Event: 3rd IEEE International Conference on Metaverse Computing, Networking and Applications, MetaCom 2025 - Seoul, Korea, Republic of
Duration: Aug 27 2025 - Aug 29 2025

Publication series

Name: Proceedings - 2025 International Conference on Metaverse Computing, Networking and Applications, MetaCom 2025

Conference

Conference: 3rd IEEE International Conference on Metaverse Computing, Networking and Applications, MetaCom 2025
Country/Territory: Korea, Republic of
City: Seoul
Period: 8/27/25 - 8/29/25

All Science Journal Classification (ASJC) codes

  • Media Technology
  • Software
  • Artificial Intelligence
  • Computer Networks and Communications
  • Computer Science Applications
  • Hardware and Architecture

Keywords

  • 3D reconstruction
  • Generative adversarial networks
  • Image reconstruction
  • Neural radiance field
