Details
MPM-GS: Optimizing Sparse-View 3D Scene Reconstruction with Virtual View Rendering and Multimodal Regularization (EI indexed)
Document type: Journal article
English title: MPM-GS: Optimizing Sparse-View 3D Scene Reconstruction with Virtual View Rendering and Multimodal Regularization
Authors: Yao, Mingyuan[1,2,3]; Zhang, Shulong[1,2,3]; Huo, Yukang[1,2,3]; Zhao, Jiayin[1,2,3]; Liu, Xiao[1,2,3]; Xue, Lin[4]; Chen, Yingyi[1,2,3]; Wang, Haihua[1,2,3]
First author: Yao, Mingyuan
Affiliations: [1] National Innovation Center for Digital Fishery, China; [2] Key Laboratory of Smart Farming Technologies for Aquatic Animal and Livestock, Ministry of Agriculture and Rural Affairs, China; [3] College of Information and Electrical Engineering, China Agricultural University, Beijing, China; [4] Smart City College, Beijing Union University, China
First affiliation: National Innovation Center for Digital Fishery, China
Year: 2025
Source title: Proceedings of the International Joint Conference on Neural Networks
Indexed by: EI (Accession number: 20255019674659)
Language: English
Abstract: 3D Gaussian Splatting is widely used in 3D reconstruction and has applications in novel view synthesis and scene generation. Recent work leverages multi-view data to achieve high-quality 3D scene reconstruction. Unfortunately, these approaches overfit under sparse-view data due to insufficient view information, resulting in artifacts and aliasing. In contrast, we propose a novel method, MPM-GS, which incorporates monocular depth estimation, virtual views, and regularization techniques to address these challenges and improve reconstruction quality under sparse-view conditions. This mitigates the overfitting and artifact issues; however, it does not resolve every limitation of sparse-view data in extreme edge cases. Consequently, we develop a novel densification strategy to optimize the Gaussian point distribution and improve scene accuracy. While promising, this densification process is non-trivial, as it requires balancing efficiency against rendering fidelity. We therefore further optimize the Gaussian points and generate new, representative points to enhance both accuracy and computational efficiency. We evaluate MPM-GS qualitatively and quantitatively on the Tanks & Temples and LLFF datasets, achieving excellent rendering quality and fast rendering speed, particularly in novel view synthesis. © 2025 IEEE.
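Note: As a rough schematic of how the regularization terms mentioned in the abstract could be combined (an illustrative assumption only, not the authors' exact formulation; the weights \lambda_d, \lambda_v and the individual loss terms are hypothetical), the training objective of a depth- and virtual-view-regularized Gaussian Splatting model can be written as

\mathcal{L}_{\text{total}} = \mathcal{L}_{\text{photo}}(\hat{I}, I) + \lambda_{d}\,\mathcal{L}_{\text{depth}}(\hat{D}, D_{\text{mono}}) + \lambda_{v}\,\mathcal{L}_{\text{virtual}}(\hat{I}_{v}, \tilde{I}_{v})

where \mathcal{L}_{\text{photo}} compares rendered and ground-truth training views, \mathcal{L}_{\text{depth}} aligns rendered depth \hat{D} with a monocular depth estimate D_{\text{mono}}, and \mathcal{L}_{\text{virtual}} supervises renderings \hat{I}_{v} from synthesized virtual camera poses against pseudo targets \tilde{I}_{v}.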
