Real-time high-fidelity dense reconstruction in unconstrained environments

Establish visual SLAM methods that achieve high-fidelity, real-time dense 3D reconstruction in unconstrained environments.

Background

The paper surveys traditional visual SLAM approaches, distinguishing between sparse and dense methods, and highlights representative systems such as KinectFusion, BundleFusion, and MASt3R-SLAM that improve robustness and scalability. Despite these advances, the authors explicitly note that attaining both real-time performance and high-fidelity dense reconstruction across unconstrained, real-world scenarios remains unresolved.

VBGS-SLAM is introduced to address the fragility and lack of uncertainty modeling of current Gaussian Splatting-based SLAM methods by formulating a probabilistic, variational inference framework. While the proposed method demonstrates strong results on several datasets, the broader goal of universally achieving high-fidelity, real-time dense reconstruction in unconstrained environments is identified as an open challenge.
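To illustrate the kind of per-primitive uncertainty modeling such a probabilistic framework implies, the sketch below maintains a diagonal Gaussian posterior over one splat's 3D mean and refines it with noisy observations via a closed-form conjugate update. This is a minimal, hypothetical illustration of Bayesian uncertainty tracking in general, not the paper's actual model or API; all names and the observation model are assumptions.

```python
import numpy as np

def update_splat_mean(prior_mean, prior_var, obs, obs_var):
    """Posterior for a splat mean mu with prior N(prior_mean, prior_var)
    and a noisy 3D observation obs ~ N(mu, obs_var) (all diagonal).
    Hypothetical sketch; not the paper's formulation."""
    precision = 1.0 / prior_var + 1.0 / obs_var   # precisions add
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# One splat: broad prior, then two depth-derived observations shrink
# the posterior variance and pull the mean toward the measurements.
mean = np.zeros(3)
var = np.full(3, 1.0)  # high initial uncertainty
for obs in [np.array([0.1, 0.0, 2.0]), np.array([0.12, -0.02, 1.98])]:
    mean, var = update_splat_mean(mean, var, obs, obs_var=np.full(3, 0.05))

print(mean.round(3), var.round(3))
```

The shrinking posterior variance is exactly the signal a robust SLAM backend could use to down-weight unreliable splats during optimization, which is the kind of fragility the probabilistic formulation targets.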

References

"Despite these advances, achieving high-fidelity, real-time dense reconstruction in unconstrained environments remains an open challenge."

VBGS-SLAM: Variational Bayesian Gaussian Splatting Simultaneous Localization and Mapping (2604.02696 - Zhu et al., 3 Apr 2026), Section 2.1 Traditional Visual SLAM (Related Works)