CoSurfGS: Collaborative 3D Surface Gaussian Splatting with Distributed Learning for Large Scene Reconstruction

3D Gaussian Splatting (3DGS) has demonstrated impressive performance in scene reconstruction. However, most existing GS-based surface reconstruction methods focus on 3D objects or limited scenes. Directly applying these methods to large-scale scene reconstruction poses challenges such as high memory cost, excessive training time, and a lack of geometric detail, making them difficult to deploy in practical applications. To address these issues, we propose a multi-agent collaborative fast 3DGS surface reconstruction framework based on distributed learning for large-scale scenes. Specifically, we develop a local model compression (LMC) scheme and a model aggregation scheme (MAS) to achieve high-quality surface representation of large scenes while reducing GPU memory consumption. Extensive experiments on Urban3d, MegaNeRF, and BlendedMVS demonstrate that our proposed method achieves fast and scalable high-fidelity surface reconstruction and photorealistic rendering.
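The abstract describes a distributed pipeline: partition the large scene across agents, let each agent reconstruct and compress its local Gaussian model (LMC), then merge the compressed local models into a global one (MAS). The sketch below is only a minimal illustration of that workflow under stated assumptions, not the paper's actual algorithms: the x-axis partitioning, opacity-threshold pruning, concatenation-based aggregation, and all function names (`partition_scene`, `local_model_compression`, `model_aggregation`) are hypothetical stand-ins.

```python
# Illustrative sketch only: the partitioning, pruning threshold, and
# concatenation-based aggregation below are assumptions, not CoSurfGS's
# actual LMC/MAS algorithms.
import numpy as np


def partition_scene(points: np.ndarray, n_agents: int) -> list[np.ndarray]:
    """Split a point set into spatial blocks along the x-axis (assumed scheme)."""
    order = np.argsort(points[:, 0])
    return [points[idx] for idx in np.array_split(order, n_agents)]


def local_model_compression(gaussians: np.ndarray, opacity: np.ndarray,
                            threshold: float = 0.05) -> tuple[np.ndarray, np.ndarray]:
    """Toy stand-in for LMC: prune low-opacity Gaussians to shrink the local model."""
    keep = opacity > threshold
    return gaussians[keep], opacity[keep]


def model_aggregation(local_models: list[np.ndarray]) -> np.ndarray:
    """Toy stand-in for MAS: merge the compressed local models into one scene model."""
    return np.concatenate(local_models, axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene_points = rng.uniform(-100.0, 100.0, size=(10_000, 3))  # fake large scene
    blocks = partition_scene(scene_points, n_agents=4)

    compressed = []
    for block in blocks:
        # In the real framework each agent would optimize a 3DGS surface model here;
        # we only simulate per-Gaussian opacities for illustration.
        opacity = rng.uniform(0.0, 1.0, size=len(block))
        pruned, _ = local_model_compression(block, opacity)
        compressed.append(pruned)

    global_model = model_aggregation(compressed)
    print(f"{len(scene_points)} input points -> {len(global_model)} Gaussians after LMC+MAS")
```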
