
Learning-Based Stochastic Hybrid Systems

Updated 5 January 2026
  • LSHS is a modeling framework that describes systems with continuous stochastic dynamics and discrete mode transitions by integrating formal system theory with machine learning.
  • It uses unsupervised system identification methods, Gaussian Process regression, and neural architectures to capture nonlinear dynamics and learn rare event transitions.
  • LSHS enables robust multi-step prediction, state estimation, and real-time control, outperforming single-mode baselines in applications like robotics and power systems.

A Learning-Based Stochastic Hybrid System (LSHS) is a data-driven framework for modeling, predicting, and controlling systems whose dynamics exhibit both continuous stochastic evolution and discrete mode transitions. LSHS architectures merge formal stochastic hybrid system theory with machine learning techniques to address identification, prediction, classification, and control tasks in domains characterized by multimodal, piecewise-smooth, nonlinear, and switching behaviors. Prominent instantiations employ unsupervised system identification, scalable regression models, surrogate optimization, and discriminative classifiers operating over high-dimensional time-series data and latent state representations (Lee et al., 2017, Poli et al., 2021, Bortolussi et al., 2015, Varmazyari et al., 25 Aug 2025, Varmazyari et al., 29 Dec 2025, Dai et al., 2023, Fukasawa, 2020).

1. Mathematical Formulation and Model Structure

LSHS frameworks describe evolution using a finite set of discrete modes $M = \{1, \ldots, m\}$ and corresponding continuous state spaces $F_i \subset \mathbb{R}^d$ with $F_i \cap F_j = \varnothing$ for $i \ne j$. At time $t$, the system state $(x_t, m_t)$ evolves according to mode-dependent stochastic differential (or difference) dynamics (Lee et al., 2017, Poli et al., 2021):

$$\dot{x} = f_i(x) + w, \quad w \sim \mathcal{N}(0, Q_i), \quad x \in F_i,\ m_t = i$$

Discrete transitions occur when $x_t$ enters a guard region $G_{ij} \subset F_i$, triggering a mode switch $i \to j$ and a reset $x^+ = R_{ij}(x)$. The overall evolution is

$$x_{t+1} = \begin{cases} R_{ij}(x_t) + w_t, & x_t \in G_{ij},\ m_t = i,\ m_{t+1} = j \\ f_i(x_t) + w_t, & x_t \in F_i \setminus \bigcup_j G_{ij},\ m_t = i,\ m_{t+1} = i \end{cases}$$

The discrete mode $m_t$ may transition stochastically, either according to a learned classifier $c(m_{t+1} \mid m_t, x_t)$ or explicitly parameterized transition intensities, e.g., Markov generators $Q$ in switching-diffusion settings (Fukasawa, 2020).
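The mode-switching update above can be made concrete with a minimal two-mode simulation. The thermostat-style flows, guards, and identity reset below are illustrative stand-ins for learned models, not a system from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)
DT = 0.01

# Mode-dependent flows f_i and guards G_ij for a hypothetical
# thermostat-like two-mode system.
FLOWS = {1: lambda x: x + DT * 1.0,     # mode 1: heating
         2: lambda x: x - DT * 1.5}     # mode 2: cooling
GUARDS = {(1, 2): lambda x: x >= 21.0,  # switch to cooling above 21
          (2, 1): lambda x: x <= 19.0}  # switch to heating below 19

def step(x, mode, noise_std=0.01):
    """One step of x_{t+1} = f_i(x_t) + w_t with guard-triggered switching."""
    w = rng.normal(0.0, noise_std)           # w ~ N(0, Q_i)
    for (i, j), guard in GUARDS.items():
        if i == mode and guard(x):
            return x + w, j                  # identity reset R_ij
    return FLOWS[mode](x) + w, mode

x, mode = 20.0, 1
xs = []
for _ in range(5000):
    x, mode = step(x, mode)
    xs.append(x)
```

The trajectory oscillates between the two guard boundaries, illustrating how the guard set, not a fixed schedule, determines when mode transitions fire.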

LSHS generalizations extend to systems described by joint discrete/continuous variables, jump-diffusion processes, or control-affine nonlinearities:

  • Hybrid state: $(x_t, z_t)$, with $x_t \in \mathbb{R}^n$, $z_t \in \{1, \ldots, K\}$
  • Mode-conditioned continuous flow: $\dot{x}_t = F_z(t, x_t; \omega_z)$
  • Stochastic mode transitions: learned density $p_{i \to j}(\tau \mid \mathcal{H}_{t_k}; \theta_{i \to j})$ via normalizing flows (Poli et al., 2021)
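The stochastic transition component amounts to sampling the next switching time $\tau$ from a learned density. As a sketch, a simple parametric Weibull density stands in here for the history-conditioned normalizing-flow model; inverse-CDF sampling is the only machinery needed:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_event_time(theta, size=1):
    """Inverse-CDF sampling of tau ~ Weibull(shape k, scale lam).
    A parametric stand-in for the flow-based density p_{i->j}(tau | history);
    a normalizing flow would replace this closed-form quantile function."""
    k, lam = theta
    u = rng.uniform(size=size)
    return lam * (-np.log(1.0 - u)) ** (1.0 / k)

# Draw holding times for a hypothetical i -> j transition.
taus = sample_event_time((2.0, 1.0), size=100_000)
```

With shape $k=2$ and scale $\lambda=1$, the sample mean approaches $\lambda\,\Gamma(1 + 1/k) \approx 0.886$, a quick consistency check on the sampler.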

2. Data-Driven Identification and Learning Algorithms

Unsupervised system identification in LSHS typically combines clustering, regression, and probabilistic inference (Lee et al., 2017, Poli et al., 2021):

  • Initial spectral K-means clustering segments input trajectories into tentative modes.
  • Mode-wise regression: Gaussian Process (GP) regression learns smooth, nonlinear vector fields fif_i restricted to FiF_i; jump maps RijR_{ij} are fit by GP or neural regression on observed (or synthetically oversampled) transition pairs.
  • Synthetic oversampling (SMOTE extension) augments rare transition events to stabilize jump map learning by interpolative sampling over observed jump pairs.
  • Particle filtering: A sequential-importance-sampling particle filter integrates the learned hybrid dynamics, using learned mode classifiers and regression models to propagate and re-weight samples.
  • Algorithmic iteration: Mode assignments and model parameters are updated in cycles until joint convergence, employing MAP reassignment and classifier retraining.
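The synthetic-oversampling step can be sketched as linear interpolation between observed (pre-jump, post-jump) state pairs. The random-pairing scheme below is a simplified assumption, not the exact SMOTE extension of the cited work:

```python
import numpy as np

rng = np.random.default_rng(2)

def oversample_jumps(pre, post, n_new):
    """Synthesize extra (pre, post) training pairs for the jump map R_ij
    by interpolating each observed pair toward a randomly chosen other
    pair with a shared mixing weight. pre, post: arrays of shape (n, d)."""
    n = len(pre)
    i = rng.integers(0, n, size=n_new)
    j = rng.integers(0, n, size=n_new)
    lam = rng.uniform(size=(n_new, 1))
    new_pre = pre[i] + lam * (pre[j] - pre[i])
    new_post = post[i] + lam * (post[j] - post[i])
    return new_pre, new_post

# Toy restitution-style jump data: post-jump speed is 0.8 * |pre-jump speed|.
pre = np.array([[0.0, -1.0], [0.0, -2.0], [0.0, -3.0]])
post = 0.8 * np.abs(pre)
new_pre, new_post = oversample_jumps(pre, post, n_new=50)
```

Because the same mixing weight is applied to both sides of each pair, synthesized examples stay consistent with any jump relation that is locally linear over the observed pairs.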

Neural architectures for end-to-end learning use segments-of-trajectory encoders and mode-conditioned neural ODEs (Poli et al., 2021). Normalizing flows are trained to model mode-to-mode event time distributions and jump densities. Loss functions combine trajectory reconstruction, jump-map MSE, and event time log-likelihoods.
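A minimal sketch of a mode-conditioned flow: a tiny MLP takes the state concatenated with a one-hot mode encoding and is integrated between switching events with Euler steps. The architecture and weights here are hypothetical stand-ins for the trained neural-ODE flows:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical mode-conditioned vector field F_z(x): a one-hidden-layer
# MLP over [x; onehot(z)], standing in for a trained neural-ODE flow.
D, K, H = 2, 3, 16
W1 = rng.normal(0, 0.3, (H, D + K)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.3, (D, H));     b2 = np.zeros(D)

def flow(x, z):
    """Evaluate the mode-z vector field at state x."""
    onehot = np.eye(K)[z]
    h = np.tanh(W1 @ np.concatenate([x, onehot]) + b1)
    return W2 @ h + b2

def integrate(x0, z, t_end, dt=0.01):
    """Euler integration of the mode-z flow between switching events."""
    x = x0.copy()
    for _ in range(int(t_end / dt)):
        x = x + dt * flow(x, z)
    return x

x_end = integrate(np.zeros(2), z=1, t_end=1.0)
```

Conditioning on the mode through the one-hot input lets a single network represent all $K$ flows; different modes yield different trajectories from the same initial state.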

Parameter learning for LSHS may also be posed as a likelihood maximization over logical constraints (MiTL), with statistical model checking (SMC) and Gaussian-process surrogate optimization for qualitative properties (Bortolussi et al., 2015). EM algorithms are used for parametric estimation under partial observation, with filtering over hidden discrete modes and closed-form M-steps for transition-rate matrices and drift parameters (Fukasawa, 2020).
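The closed-form M-step for a transition-rate matrix reduces to jump counts divided by occupation times. The sketch below assumes a fully observed mode path; under partial observation these sufficient statistics would be replaced by their filtered/smoothed expectations:

```python
import numpy as np

def mstep_rate_matrix(modes, dt):
    """Closed-form M-step for a Markov generator Q from a sampled mode
    path: q_ij = (number of i -> j jumps) / (occupation time in mode i),
    with diagonal entries set so each row sums to zero."""
    K = int(modes.max()) + 1
    counts = np.zeros((K, K))
    occ = np.zeros(K)
    for a, b in zip(modes[:-1], modes[1:]):
        occ[a] += dt
        if a != b:
            counts[a, b] += 1
    Q = counts / np.maximum(occ[:, None], 1e-12)
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

# Toy path sampled at dt = 0.1: two 0 -> 1 jumps, one 1 -> 0 jump.
modes = np.array([0, 0, 0, 1, 1, 0, 0, 1, 1, 1])
Q = mstep_rate_matrix(modes, dt=0.1)
```

On this path the estimator gives $q_{01} = 2/0.5 = 4$ and $q_{10} = 1/0.4 = 2.5$, the standard maximum-likelihood rates for a continuous-time Markov chain observed on a grid.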

3. Model-Based Prediction, Control, and State Estimation

LSHS provides a generative model suitable for n-step ahead prediction, state tracking, and optimal control:

  • Multi-step prediction: GPs yield predictive mean/variance for each mode, with hybrid models correctly propagating multimodal distributions across transitions.
  • State estimation: Particle filters integrate learned Gaussian-process-based transitions and resets, with re-weighting using observed data likelihoods.
  • Control: Hamilton-Jacobi-Bellman (HJB) equations for cost-minimizing stochastic optimal control are solved via Deep FBSDE controllers using LSTM approximators for the value function gradient; implementation includes soft penalties for state constraints and resets at event times (Dai et al., 2023).
  • Closed-loop observer augmentation allows rapid recognition of mismatches between nominal and actual system matrices, enhancing detection of hidden contingencies.
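The particle-filtering component can be sketched as a bootstrap filter whose particles carry both a continuous state and a discrete mode. The toy dynamics and observation model below are stand-ins for the learned GP transitions and classifiers:

```python
import numpy as np

rng = np.random.default_rng(3)

def hybrid_step(x, mode):
    """Toy stand-in for learned mode-dependent dynamics: mode 0 drifts
    right, mode 1 drifts left; the mode flips with small probability."""
    drift = 0.1 if mode == 0 else -0.1
    new_mode = 1 - mode if rng.random() < 0.02 else mode
    return x + drift + rng.normal(0.0, 0.05), new_mode

def particle_filter(ys, n_particles=500, obs_std=0.2):
    """Bootstrap particle filter over hybrid particles (x, mode)."""
    xs = rng.normal(0.0, 1.0, n_particles)
    ms = rng.integers(0, 2, n_particles)
    estimates = []
    for y in ys:
        # propagate each particle through the hybrid dynamics
        for k in range(n_particles):
            xs[k], ms[k] = hybrid_step(xs[k], ms[k])
        # re-weight by the observation likelihood y ~ N(x, obs_std^2)
        w = np.exp(-0.5 * ((y - xs) / obs_std) ** 2)
        w /= w.sum()
        estimates.append(np.dot(w, xs))
        # multinomial resampling
        idx = rng.choice(n_particles, n_particles, p=w)
        xs, ms = xs[idx].copy(), ms[idx].copy()
    return np.array(estimates)

true_x = np.cumsum(np.full(50, 0.1))            # mode-0 ground truth
ys = true_x + rng.normal(0.0, 0.2, 50)
est = particle_filter(ys)
```

Particles whose mode hypothesis disagrees with the data drift away from the observations, receive low weight, and are eliminated at resampling, which is how the filter tracks the hidden mode implicitly.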

4. Classification and Real-Time Detection in LSHS

LSHS approaches classify physical, control, and measurement contingencies in power systems and grid applications (Varmazyari et al., 29 Dec 2025, Varmazyari et al., 25 Aug 2025):

  • Multivariate time-series features are extracted from high-frequency measurements, stacked into windows for input to Transformer or LSTM classifiers.
  • Discrete modes encode contingency types; linearized system matrices $(A, B, C)$ are mode-dependent, changing upon line outages or sensor faults.
  • Feature aggregation and classifier decision rules (KNN, SVM, LSTM, Transformer) facilitate rapid detection and categorization. Aggregated log-errors and time-series error signals are used as input features.
  • Eigen-structure mapping of closed-loop system matrices distinguishes contingency class by structural signature (changes in $A + BK$ and $A + GC$).
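The windowing-plus-classifier pipeline above can be sketched end to end. The synthetic two-channel signals and the choice of summary statistics (mean, standard deviation, log-energy) are illustrative assumptions, with a 1-nearest-neighbour rule standing in for the KNN/Transformer classifiers:

```python
import numpy as np

rng = np.random.default_rng(5)

def window_features(series, win):
    """Stack a multivariate series (T, d) into windows and aggregate
    per-window statistics: mean, std, and log-energy per channel."""
    T, d = series.shape
    n = T // win
    w = series[: n * win].reshape(n, win, d)
    return np.concatenate(
        [w.mean(1), w.std(1), np.log(1e-9 + (w ** 2).sum(1))], axis=1)

def knn_predict(train_x, train_y, test_x):
    """1-nearest-neighbour decision rule on aggregated features."""
    d2 = ((test_x[:, None, :] - train_x[None, :, :]) ** 2).sum(-1)
    return train_y[d2.argmin(axis=1)]

def make(amp, n_windows, win=20):
    """Synthetic 2-channel 'measurements' for one contingency class."""
    t = np.arange(n_windows * win)
    sig = np.stack([amp * np.sin(0.3 * t), amp * np.cos(0.3 * t)], 1)
    return sig + rng.normal(0, 0.05, sig.shape)

# Two synthetic contingency classes differing in oscillation amplitude.
X0 = window_features(make(1.0, 40), 20)
X1 = window_features(make(3.0, 40), 20)
train_x = np.vstack([X0[:30], X1[:30]])
train_y = np.array([0] * 30 + [1] * 30)
test_x = np.vstack([X0[30:], X1[30:]])
pred = knn_predict(train_x, train_y, test_x)
```

Even this crude feature set separates the two classes cleanly, since the std and log-energy columns scale with oscillation amplitude; the cited work applies the same window-then-classify pattern to far richer grid measurements.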

LSHS classification frameworks achieve high detection accuracy (96–99%) and millisecond-scale latency in simulation on IEEE standard test cases.

5. Applications, Experimental Benchmarks, and Performance

LSHS frameworks have been empirically validated in robotics, power systems, gene networks, and networked control domains:

| Task/Domain | Methods | Key Metrics |
|---|---|---|
| Bouncing ball / robot box | Hybrid GP, EKF, SGP | 2–5× log-likelihood gain |
| TCP congestion learning | Neural Hybrid Automata | v-measure ≈ 0.96 |
| Power grid contingency detection | Transformer, KNN (LSHS) | Accuracy: 96–99% |
| Biped walking control | DFBSDE controller (hybrid) | Constraint violations: 0 |

Hybrid learning approaches outperform single-mode baselines near mode transitions due to bimodal predictive distributions and correct domain partitioning (Lee et al., 2017, Poli et al., 2021). Deep learning-based controllers are computationally efficient, with per-step inference times orders-of-magnitude faster than trajectory optimization (Dai et al., 2023).

6. Limitations, Scalability, and Future Directions

  • Scalability challenges arise in high-dimensional state and parameter spaces due to GP surrogate model complexity and simulation budget requirements for rare events.
  • Identifiability in qualitative-constraint-based learning depends critically on how informative the chosen logical formulae are.
  • Segmentation noise and mode-mixing can degrade performance of mode inference networks.
  • Extensions under active exploration, structural sparsity, scalable GP approximations, and adaptive online learning are under investigation.
  • Future work includes nonlinear and distributed model generalization, policy learning in manipulation tasks with hybrid dynamics, embedding of LSHS forward models into reinforcement learning frameworks, and closed-loop control under partial observation.

7. References to Primary Research

The foundational and application-oriented research underpinning LSHS is cited inline throughout this article (Lee et al., 2017; Poli et al., 2021; Bortolussi et al., 2015; Varmazyari et al., 25 Aug 2025; Varmazyari et al., 29 Dec 2025; Dai et al., 2023; Fukasawa, 2020).

LSHS represents an active intersection of identification, prediction, control, and detection methodologies for systems exhibiting stochastic, discontinuous, and multi-modal regime evolution, with broad implications for both theory and real-world practice.
