
Uncertainty-Aware Diffusion Bridge Model

Updated 5 February 2026
  • The paper introduces UDBM, which relaxes strict terminal constraints by using a spatially-varying Gaussian to eliminate drift singularity.
  • It leverages pixel-wise uncertainty modulation and dual noise-path scheduling to adaptively manage stochastic dynamics in various applications.
  • Empirical results show UDBM’s robust performance in image restoration, trajectory prediction, and segmentation with improved PSNR, SSIM, and calibrated uncertainty.

The Uncertainty-Aware Diffusion Bridge Model (UDBM) constitutes a class of generative and inference models that reformulate classical diffusion bridge and Schrödinger bridge frameworks to incorporate explicit quantification and utilization of uncertainty. Originating in all-in-one image restoration, UDBM extends the classical diffusion bridge by introducing a relaxed terminal constraint informed by pixel-wise uncertainty, thus resolving canonical issues such as drift singularity at terminal time and enabling adaptive, task-robust bridge construction. UDBM architectures have subsequently been generalized to applications including trajectory prediction, image segmentation, robust stochastic estimation, and more, reflecting the versatility and rigorous probabilistic foundations this model provides (Tu et al., 29 Jan 2026, Capellera et al., 24 Mar 2025, Luo et al., 5 Oct 2025, Teimouri et al., 2024, Yoshioka, 5 Dec 2025).

1. Stochastic Transport Formulation and Relaxed Diffusion Bridge

UDBM models a stochastic bridge between two distinct data distributions, such as clean and degraded images (in restoration), by positing a continuous-time stochastic process $\{x_t\}_{t\in[0,1]}$ evolving according to the Itô SDE:

$$dx_t = f(x_t,t)\,dt + g(t)\,dW_t$$

The classical Doob's $h$-transform implements a diffusion bridge by conditioning the terminal distribution $p(x_1)$ on the observed endpoint (e.g., the degraded observation $X_{lq}$):

$$dx_t = \left[f(x_t,t) + g(t)^2\nabla_{x_t}\log h(x_t,t)\right]dt + g(t)\,dW_t$$

where $h$ encodes the path-wise conditioning.

Standard bridges enforce a strict point-mass endpoint constraint $p(x_1) = \delta(x_1 - X_{lq})$, inducing a drift correction that diverges as $t \to 1$, specifically scaling as $|X_{lq} - x_t|/(1-t)$ (drift singularity; Proposition 3.2 in (Tu et al., 29 Jan 2026)). UDBM replaces the $\delta$ with a spatially varying Gaussian:

$$p(x_1) = \mathcal{N}(x_1;\, X_{lq},\, \sigma^2(u)), \quad \sigma^2(u) = I + u$$

where $u \in \mathbb{R}^{H\times W\times C}$ is a pixel-wise uncertainty map. The drift correction becomes:

$$\left|g(t)^2 \nabla \log h_{\mathrm{relaxed}}(x_t, t)\right| \approx \frac{|X_{lq} - x_t|}{(1-t) + \sigma^2(u)} = O(1)$$

ensuring Lipschitz-boundedness as $t \to 1$ (Theorem 3.3 in (Tu et al., 29 Jan 2026)). This formulation naturally generalizes to continuous-time bridges for robust control and optimal transport under uncertainty (Yoshioka, 5 Dec 2025).
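As a concrete illustration, the relaxed bridge can be simulated with a scalar Euler–Maruyama scheme. This is a minimal sketch, assuming zero base drift $f$ and constant $g$, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_relaxed_bridge(x0, x_lq, sigma2_u, n_steps=1000, g=1.0):
    # Euler-Maruyama integration of the relaxed bridge SDE on [0, 1].
    dt = 1.0 / n_steps
    x = x0
    for i in range(n_steps):
        t = i * dt
        # Drift toward the terminal with the relaxed (non-singular) denominator:
        # the (1 - t) + sigma^2(u) term never vanishes, so the drift stays bounded.
        drift = g**2 * (x_lq - x) / ((1.0 - t) + sigma2_u)
        x += drift * dt + g * np.sqrt(dt) * rng.normal()
    return x

# With sigma^2(u) > 0 the path ends *near* x_lq rather than exactly on it,
# consistent with the Gaussian terminal N(x_lq, sigma^2(u)).
samples = np.array([simulate_relaxed_bridge(0.0, 1.0, sigma2_u=0.25)
                    for _ in range(200)])
print(samples.mean(), samples.std())
```

The terminal samples concentrate around $X_{lq}$ with nonzero spread, in contrast to a strict bridge, which would pin every path exactly to $X_{lq}$ at the cost of a diverging drift.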

2. Pixel-wise/State-wise Uncertainty Modeling and Its Role

The uncertainty map $u$ is derived analytically or empirically as a proxy for local aleatoric uncertainty. In image restoration, a small pre-restorer network $\hat{U}$ is trained via an $L_1$ loss on clean/degraded pairs:

$$u = |\hat{U}(x_{lq}) - x_{lq}|$$

This $u$ correlates with spatial degradation intensity: large residuals reveal high uncertainty. $u$ modulates both the relaxed endpoint density and the time-varying diffusion coefficient $g(t)$, aligning restoration difficulty with model stochasticity (Tu et al., 29 Jan 2026). In multi-agent trajectory and time-series settings, per-state or per-timestep uncertainty is estimated by augmenting the standard denoising loss with a negative log-likelihood term for the predicted noise standard deviation, yielding state-wise calibrated uncertainties in real space (Capellera et al., 24 Mar 2025, Luo et al., 5 Oct 2025).
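The residual-based uncertainty proxy can be sketched as follows; the placeholder standing in for the trained pre-restorer $\hat{U}$ is hypothetical, used purely to make the example runnable:

```python
import numpy as np

def pre_restorer(x_lq):
    # Hypothetical stand-in for the trained pre-restorer network U-hat;
    # here just a clip to [0, 1], so out-of-range pixels produce residuals.
    return np.clip(x_lq, 0.0, 1.0)

def uncertainty_map(x_lq):
    # u = |U-hat(x_lq) - x_lq|: large residuals flag heavily degraded pixels.
    return np.abs(pre_restorer(x_lq) - x_lq)

# Illustrative degraded input with some pixel values outside [0, 1].
x_lq = np.random.default_rng(1).uniform(-0.2, 1.2, size=(8, 8, 3))  # H x W x C
u = uncertainty_map(x_lq)
print(u.shape)  # one non-negative uncertainty value per pixel and channel
```

The key property is that $u$ has the same $H \times W \times C$ shape as the input, so it can directly modulate the per-pixel terminal variance $\sigma^2(u) = I + u$.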

3. Dual Modulation: Noise Schedule and Path Schedule

UDBM introduces a dual modulation strategy for both noise and path schedules:

  • Noise Schedule: The latent mixing term $B_t(u)$ is spatially and temporally modulated:

$$B_t(u) = \beta_{bridge}(u)\, t(1-t) + \beta_{relax}(u)\, t^2$$

with $\beta_{bridge}(u) = b \cdot (1+u)$ and $\beta_{relax}(u) = 1+u$. This design ensures high-entropy mixing in difficult regions and precise terminal variance, aligning heterogeneous degradations into a shared manifold (Tu et al., 29 Jan 2026).

  • Path Schedule: Adapting Schrödinger bridge PDEs with entropy-regularized optimal transport, UDBM reparameterizes time along pixel-wise uncertainty. The path coefficients

$$a_t(u) = \frac{T(u)^t}{T(u)^t + (1-t)^{T(u)}}, \quad y_t(u) = 1 - a_t(u)$$

depend on $T(u) = (1-u)\,T_{OT} + u\,T_{EOT}$, blending optimal-transport and entropy-regularized schedules.

These schedules control both the trajectory and stochastic geometry of the transport, matching the dynamics of the viscous Hamilton-Jacobi-Bellman (HJB) equations:

$$\partial_t \varphi_t + \epsilon(u)\,\Delta \varphi_t = \frac{1}{2}\Vert \nabla \varphi_t \Vert^2$$

with spatially adaptive damping in high-uncertainty regions (Tu et al., 29 Jan 2026).
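The two schedules can be evaluated numerically as below; the constants $b$, $T_{OT}$, and $T_{EOT}$ are illustrative choices, not values prescribed by the paper:

```python
import numpy as np

b, T_OT, T_EOT = 1.0, 2.0, 8.0  # hypothetical constants for illustration

def noise_schedule(t, u):
    # B_t(u) = beta_bridge(u) * t * (1 - t) + beta_relax(u) * t^2
    beta_bridge = b * (1.0 + u)
    beta_relax = 1.0 + u
    return beta_bridge * t * (1.0 - t) + beta_relax * t**2

def path_schedule(t, u):
    # T(u) blends the optimal-transport and entropy-regularized time scales.
    T = (1.0 - u) * T_OT + u * T_EOT
    a_t = T**t / (T**t + (1.0 - t) ** T)
    return a_t, 1.0 - a_t  # (a_t(u), y_t(u))

u = np.array([0.0, 0.5, 1.0])  # low / medium / high uncertainty pixels
print(noise_schedule(1.0, u))  # terminal variance 1 + u, the relaxed sigma^2(u)
a_t, y_t = path_schedule(0.5, u)
print(a_t + y_t)               # the path coefficients sum to 1 at every t
```

Note that at $t = 1$ the bridge term of $B_t(u)$ vanishes, so the terminal variance reduces to $1 + u$ per pixel, matching the relaxed Gaussian endpoint $\sigma^2(u) = I + u$.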

4. Algorithmic Implementation

UDBM admits both efficient training and inference. In image restoration (Tu et al., 29 Jan 2026):

  • Training:

    1. Sample $(X_{hq}, X_{lq})$, $t \sim \mathrm{Uniform}(0,1)$, $\epsilon \sim \mathcal{N}(0,I)$.
    2. Compute $u$ via the pre-restorer.
    3. Form the bridge state $x_t = a_t(u)\,X_{lq} + y_t(u)\,X_{hq} + B_t(u)\,\epsilon$.
    4. Predict $x_0^\theta = D_\theta(x_t, t, u)$.
    5. Minimize $\ell_1(x_0^\theta, X_{hq})$.

  • Inference: (single-step, DDIM-inspired)
    1. Given $X_{lq}$, compute $u$ and initialize $x_1 = X_{lq} + \sqrt{\sigma^2(u)}\,\epsilon$.
    2. Predict $x_0$ and the auxiliary noise.
    3. Apply a closed-form deterministic update to obtain the restored $x_0$ in one shot.
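The training and inference procedures above can be sketched end to end. The "denoiser" here is a hypothetical per-pixel affine map standing in for $D_\theta$, and the schedule constants are illustrative; only the control flow mirrors the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def schedules(t, u, b=1.0, T_ot=2.0, T_eot=8.0):
    # a_t(u), y_t(u) from the path schedule; B_t(u) from the noise schedule.
    T = (1 - u) * T_ot + u * T_eot
    a = T**t / (T**t + (1 - t) ** T)
    B = b * (1 + u) * t * (1 - t) + (1 + u) * t**2
    return a, 1 - a, B

def denoiser(x_t, t, u, W):
    # Hypothetical stand-in for D_theta(x_t, t, u): a per-pixel affine map.
    return W[0] * x_t + W[1] * u + W[2] * t

def training_loss(x_hq, x_lq, u, W):
    t = rng.uniform()                       # t ~ Uniform(0, 1)
    eps = rng.standard_normal(x_hq.shape)   # eps ~ N(0, I)
    a, y, B = schedules(t, u)
    x_t = a * x_lq + y * x_hq + B * eps     # bridge state
    x0_pred = denoiser(x_t, t, u, W)        # predict the clean image
    return np.abs(x0_pred - x_hq).mean()    # l1 loss

def restore_one_step(x_lq, u, W):
    # Initialize at the relaxed Gaussian terminal, sigma^2(u) = 1 + u, then
    # apply one deterministic (DDIM-style) update: here, the x_0 prediction.
    x_1 = x_lq + np.sqrt(1.0 + u) * rng.standard_normal(x_lq.shape)
    return denoiser(x_1, 1.0, u, W)

x_hq = rng.uniform(size=(8, 8, 3))          # clean image
x_lq = np.clip(x_hq + 0.1 * rng.standard_normal(x_hq.shape), 0, 1)
u = np.abs(x_lq - x_hq)                     # uncertainty proxy
W = (1.0, 0.0, 0.0)                         # hypothetical "trained" weights
print(training_loss(x_hq, x_lq, u, W))
print(restore_one_step(x_lq, u, W).shape)
```

In a real system the affine map would be a trained network and the closed-form update would combine the $x_0$ prediction with the auxiliary noise estimate; the sketch only shows how $u$ threads through sampling, training, and the single-step restoration.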

This architecture generalizes to multi-agent systems (state-wise conditioning and sampling), ventricle segmentation with Monte Carlo dropout for epistemic uncertainty (Teimouri et al., 2024), and robust stochastic process estimation (Yoshioka, 5 Dec 2025).

5. Theoretical Properties: Drift Singularity and Entropy Regularization

Classical diffusion bridges suffer from drift singularity due to strict endpoint pinning. UDBM’s relaxation leads to bounded drift corrections:

  • For strict terminals: $|g(t)^2 \nabla \log h_{\mathrm{strict}}| \sim |X_{lq} - x_t|/(1-t) \to \infty$ as $t \to 1$.
  • For relaxed Gaussian terminals: $|g(t)^2 \nabla \log h_{\mathrm{relaxed}}| \lesssim |X_{lq} - x_t|/[(1-t) + \sigma^2(u)] = O(1)$, eliminating singularities.
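A quick numeric check of the two bounds, with scalar stand-ins for pixel values (not the paper's code):

```python
# Compare the strict and relaxed drift-correction magnitudes as t -> 1.
x_lq, x_t, sigma2_u = 1.0, 0.2, 0.5  # illustrative scalar values

for t in (0.9, 0.99, 0.999):
    strict = abs(x_lq - x_t) / (1 - t)                 # diverges as t -> 1
    relaxed = abs(x_lq - x_t) / ((1 - t) + sigma2_u)   # stays O(1)
    print(f"t={t}: strict={strict:.1f}, relaxed={relaxed:.3f}")

# The relaxed correction is bounded by |x_lq - x_t| / sigma^2(u) = 1.6 for
# every t in [0, 1), while the strict one grows without bound.
```

This is exactly the mechanism behind Theorem 3.3 cited above: the $\sigma^2(u)$ term in the denominator acts as a floor that keeps the drift Lipschitz-bounded.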

The entropic regularization, via the pixel-wise $\epsilon(u)$, ensures smooth propagation even in highly uncertain regions and matches the viscous HJB and Schrödinger bridge frameworks. This theoretical machinery directly supports empirical robustness and convergence properties (Tu et al., 29 Jan 2026, Yoshioka, 5 Dec 2025).

6. Empirical Results and Generalization

In all-in-one image restoration, UDBM achieves:

  • Average PSNR: $32.61\,\mathrm{dB}$, SSIM: $0.933$ (UDBM-Large), outperforming prior SOTA (e.g., HOGformer at $31.57\,\mathrm{dB}$) with single-step inference at $\approx 47\,\mathrm{GFlops}$.
  • Task-specific performance: exceeds BioIR on denoising (BSD68); $33.87\,\mathrm{dB}$ / $0.968$ SSIM on deblurring (GoPro).
  • Robustness on composite and real benchmarks (CDD11): best NIQE/MANIQA scores, demonstrating cross-distribution generalization (Tu et al., 29 Jan 2026).

In multi-agent trajectory tasks, UDBM (as U2Diff) delivers competitive minimum average/scene displacement errors, calibrated variance maps, and enables downstream rank-based error probability estimation (Spearman $\rho \approx 0.56$–$0.78$ for mode ranking) (Capellera et al., 24 Mar 2025). In brain ventricle segmentation, the model yields high Dice ($0.78 \pm 0.27$) and tight uncertainty–quality correlation (Pearson $r = -0.86$) (Teimouri et al., 2024).

7. Broader Applications and Generalizations

Variants of the UDBM paradigm appear across domains. UDBM adapts generically by choosing task-appropriate uncertainty proxies, bridge dynamics, and relaxation schedules, making it a unifying approach for uncertainty-aware stochastic modeling in high-dimensional spaces.
