Bridging 2D semantic material labels to physically accurate parameters for Digital Twin sensor simulation

Develop a robust and generalizable method to convert 2D semantic material segmentation labels extracted from images into physically accurate material parameter sets (e.g., BRDF/PBR parameters) assigned to reconstructed 3D surfaces, in order to enable faithful physics-based sensor simulation in Digital Twin environments.

Background

The paper proposes a camera-only pipeline that reconstructs large-scale scenes with 3D Gaussian Splatting, projects 2D semantic material labels onto reconstructed mesh surfaces, and assigns physics-based materials for simulation. While the authors demonstrate a practical approach by mapping semantic labels to PBR textures drawn from a predefined database, they note that a principled, accurate mapping from 2D semantic labels to physically grounded material parameters remains unresolved.
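The label-to-material assignment step described above can be sketched as a simple lookup from semantic labels to PBR parameter sets applied per mesh face. This is an illustrative sketch only: the label names, parameter values, and fallback behavior are assumptions, not values from the paper's database.

```python
# Hypothetical sketch: map 2D semantic material labels (projected onto mesh
# faces) to PBR parameter sets. All numeric values below are heuristic
# placeholders, not the paper's actual database entries.

# Heuristic PBR parameters (base color RGB, roughness, metallic) per label.
PBR_DATABASE = {
    "asphalt": {"base_color": (0.04, 0.04, 0.04), "roughness": 0.95, "metallic": 0.0},
    "glass":   {"base_color": (0.90, 0.90, 0.92), "roughness": 0.05, "metallic": 0.0},
    "metal":   {"base_color": (0.56, 0.57, 0.58), "roughness": 0.30, "metallic": 1.0},
}

# Fallback material for labels missing from the database.
DEFAULT_MATERIAL = {"base_color": (0.5, 0.5, 0.5), "roughness": 0.8, "metallic": 0.0}

def assign_materials(face_labels):
    """Return one PBR parameter set per mesh face, with a default fallback."""
    return [PBR_DATABASE.get(label, DEFAULT_MATERIAL) for label in face_labels]

# Example: labels projected from 2D segmentation onto three mesh faces.
faces = ["asphalt", "metal", "vegetation"]  # "vegetation" is not in the database
materials = assign_materials(faces)
print(materials[0]["roughness"])            # asphalt face -> 0.95
print(materials[2] is DEFAULT_MATERIAL)     # unknown label -> fallback -> True
```

The open challenge the paper identifies is precisely that the numeric entries in such a table are chosen heuristically rather than derived from validated physical measurements.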

Accurately linking semantic material categories identified in images (e.g., asphalt, glass, metal) to quantitative, physically meaningful parameters required by physics-based rendering and sensor models is critical for realistic Digital Twin simulations. The authors identify this gap as an open challenge, indicating that current practices rely on heuristics or predefined databases rather than a validated, general solution.
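To make concrete why quantitative parameters matter, the sketch below evaluates a standard Cook-Torrance microfacet specular term, where a scalar roughness value (not a semantic label) directly controls the reflectance a sensor model would see. This is the generic GGX/Schlick formulation, not the paper's renderer, and the input angles are illustrative.

```python
import math

def ggx_specular(n_dot_h, n_dot_v, n_dot_l, v_dot_h, roughness, f0=0.04):
    """Cook-Torrance specular reflectance for one light/view configuration.

    Standard components: GGX normal distribution D, Schlick-GGX Smith
    geometry term G, and Schlick's Fresnel approximation F.
    """
    a = roughness * roughness
    a2 = a * a
    # GGX normal distribution function D
    d = a2 / (math.pi * (n_dot_h * n_dot_h * (a2 - 1.0) + 1.0) ** 2)
    # Smith geometry term with Schlick-GGX approximation
    k = (roughness + 1.0) ** 2 / 8.0
    g = (n_dot_v / (n_dot_v * (1.0 - k) + k)) * (n_dot_l / (n_dot_l * (1.0 - k) + k))
    # Schlick Fresnel approximation
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5
    return d * g * f / (4.0 * n_dot_v * n_dot_l)

# At the specular peak (half vector aligned with the normal), an
# asphalt-like rough surface reflects far less than a glass-like smooth one.
rough = ggx_specular(1.0, 0.8, 0.8, 0.8, roughness=0.95)
smooth = ggx_specular(1.0, 0.8, 0.8, 0.8, roughness=0.05)
assert smooth > rough
```

Getting these scalar inputs physically right per material category, rather than hand-tuning them, is exactly the unresolved mapping problem this section describes.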

References

However, bridging the gap between semantic material labels from 2D segmentation and physically accurate material parameters for physics-based sensor simulation in Digital Twin environments remains an open challenge.

Material-informed Gaussian Splatting for 3D World Reconstruction in a Digital Twin (2511.20348, Silva et al., 25 Nov 2025), Section 2.5 (Physics-Based Materials for Rendering)