TY - GEN
T1 - GANeRFine
T2 - 3rd IEEE International Conference on Metaverse Computing, Networking and Applications, MetaCom 2025
AU - Dash, Ankan
AU - Gu, Jingyi
AU - Wang, Guiling
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Neural Radiance Fields (NeRFs) have transformed 3D scene reconstruction and rendering, yet they often struggle to capture fine details and realistic textures, especially when trained on sparse datasets. To address these limitations, we introduce GANeRFine, a novel framework that integrates Generative Adversarial Networks (GANs) with NeRF models to enhance output fidelity. GANs are renowned for producing highly realistic textures and intricate details, making them an ideal complement to NeRFs. GANeRFine leverages the strengths of both paradigms to refine outputs generated from both sparsely posed and densely posed images with known camera parameters. Our implementation spans four leading NeRF variants (NeRF, Mip-NeRF, Mip-NeRF 360, and ZipNeRF), demonstrating broad applicability across architectures. This level of photorealism is vital for constructing convincing virtual environments and lifelike avatars, foundational elements of immersive Metaverse applications. By improving visual fidelity from minimal input, GANeRFine supports scalable, interactive scene creation for dynamic Metaverse worlds. Extensive evaluations across diverse datasets show that GANeRFine significantly enhances the visual quality of novel views, outperforming existing models in SSIM and PSNR metrics while achieving lower LPIPS and KID scores. Furthermore, GANeRFine enables near real-time inference, increasing the practicality of NeRF-based systems in latency-sensitive and resource-constrained environments. These advancements empower real-time 3D content generation for virtual worlds, augmented experiences, and live digital twins, positioning GANeRFine as a key engine for next-generation Metaverse infrastructure. This integration not only pushes the boundaries of NeRF capabilities but also sets a new standard for photorealistic 3D rendering.
AB - Neural Radiance Fields (NeRFs) have transformed 3D scene reconstruction and rendering, yet they often struggle to capture fine details and realistic textures, especially when trained on sparse datasets. To address these limitations, we introduce GANeRFine, a novel framework that integrates Generative Adversarial Networks (GANs) with NeRF models to enhance output fidelity. GANs are renowned for producing highly realistic textures and intricate details, making them an ideal complement to NeRFs. GANeRFine leverages the strengths of both paradigms to refine outputs generated from both sparsely posed and densely posed images with known camera parameters. Our implementation spans four leading NeRF variants (NeRF, Mip-NeRF, Mip-NeRF 360, and ZipNeRF), demonstrating broad applicability across architectures. This level of photorealism is vital for constructing convincing virtual environments and lifelike avatars, foundational elements of immersive Metaverse applications. By improving visual fidelity from minimal input, GANeRFine supports scalable, interactive scene creation for dynamic Metaverse worlds. Extensive evaluations across diverse datasets show that GANeRFine significantly enhances the visual quality of novel views, outperforming existing models in SSIM and PSNR metrics while achieving lower LPIPS and KID scores. Furthermore, GANeRFine enables near real-time inference, increasing the practicality of NeRF-based systems in latency-sensitive and resource-constrained environments. These advancements empower real-time 3D content generation for virtual worlds, augmented experiences, and live digital twins, positioning GANeRFine as a key engine for next-generation Metaverse infrastructure. This integration not only pushes the boundaries of NeRF capabilities but also sets a new standard for photorealistic 3D rendering.
KW - 3D reconstruction
KW - Generative adversarial networks
KW - Image reconstruction
KW - Neural radiance field
UR - https://www.scopus.com/pages/publications/105020786658
U2 - 10.1109/MetaCom65502.2025.00043
DO - 10.1109/MetaCom65502.2025.00043
M3 - Conference contribution
AN - SCOPUS:105020786658
T3 - Proceedings - 2025 International Conference on Metaverse Computing, Networking and Applications, MetaCom 2025
SP - 213
EP - 220
BT - Proceedings - 2025 International Conference on Metaverse Computing, Networking and Applications, MetaCom 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 27 August 2025 through 29 August 2025
ER -