
History-Conditioned Local Projector

Updated 2 February 2026
  • History-conditioned local projector is a data-driven operator that infers fine-grained transition statistics in observed Markov trajectories while accounting for hidden memory.
  • The methodology empirically constructs and normalizes history-conditioned transition histograms, then applies spectral analysis to the resulting transition matrices, using their subleading eigenvalues to quantify memory timescales.
  • This approach enables the detection of deviations from the Markov property and guides state splitting to better represent hidden dynamics in complex systems.

A history-conditioned local projector is a data-driven operator on observed discrete-time Markov trajectories that infers fine-grained transition statistics between observed (coarse-grained) states while accounting for the memory effects induced by hidden or unobservable degrees of freedom. Originally introduced in the context of Markov-state holography, this analysis allows one to empirically assess local deviations from the Markov property, detect evidence of hidden-state topology, and quantify the timescale over which hidden-path correlations of the projected high-dimensional dynamics decay (Zhao et al., 14 Mar 2025).

1. Definitions and Notational Framework

Let the full (microscopic) state space be $S_\mathrm{full} = S_\mathrm{obs} \times S_\mathrm{hidden}$, where $S_\mathrm{obs} = \{1,2,\dots,n\}$ is the space of directly observed (“lumped”) states, and $S_\mathrm{hidden}(i) = \{\alpha : \pi_\mathrm{obs}(\alpha) = i\}$ denotes the set of microstates projected onto observed state $i$.

A length-$k$ observation history preceding time $t$ is $h_k = (s_{t-k},\dots,s_{t-1}) \in S_\mathrm{obs}^k$, i.e., the $k$ observed states immediately before $s_t$. For each observed transition $j \to i$ at times $t \to t+1$ (so $s_t = j$, $s_{t+1} = i$) with immediately preceding history $h_k$, define

  • $N_{i\leftarrow j}(h_k)$: the number of observed $j \to i$ transitions preceded by history $h_k$,
  • $N_j(h_k) = \sum_{i\in S_\mathrm{obs}} N_{i\leftarrow j}(h_k)$,
  • the empirical history-conditioned transition probability:

$$P_{i\leftarrow j|h_k} = \frac{N_{i\leftarrow j}(h_k)}{N_j(h_k)}.$$

These transition probabilities form the entries of the history-conditioned local projector.
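As a concrete illustration of these counts and the normalization, the following sketch estimates $P_{i\leftarrow B|h_1}$ from a short trajectory; the trajectory and history length $k=1$ are made up for illustration:

```python
from collections import Counter

# Toy trajectory over observed states (hypothetical data), history length k = 1
s = ["A", "B", "C", "A", "B", "A", "C", "B", "A", "A", "B", "C"]

N = Counter()                      # N[(h_k, i)]: B -> i transitions preceded by h_k
for t in range(1, len(s) - 1):
    if s[t] == "B":
        h_k = (s[t - 1],)          # one-step history preceding B
        N[(h_k, s[t + 1])] += 1

N_B = Counter()                    # N_B[h_k] = sum_i N[(h_k, i)]
for (h_k, i), n in N.items():
    N_B[h_k] += n

P = {(h_k, i): n / N_B[h_k] for (h_k, i), n in N.items()}
# e.g. P[(("A",), "C")] is the empirical P_{C <- B | h_1 = A}
```

Each slice $P_{\cdot\leftarrow B|h_k}$ is normalized by construction, since it divides by $N_B(h_k)$.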

2. Mathematical Construction

For each ordered transition $j \to i$ and history $h_k$, the local projector-matrix entry is defined as

$$\Pi^{(k)}_{i\leftarrow j}(h_k) \equiv P(i \mid j, h_k) = \frac{\#\{(s_{t-k},\dots,s_{t-1},\, s_t = j,\, s_{t+1} = i)\ \text{with history}\ h_k\}}{\#\{(s_{t-k},\dots,s_{t-1},\, s_t = j)\ \text{with history}\ h_k\}}.$$

Alternatively, in model-based terms (where $\alpha_{t+1} \in S_\mathrm{hidden}(j)$ are microstates reached upon entering $j$):

$$P(i \mid j, h_k) = \frac{\sum_{\alpha_{t+1}\in S_\mathrm{hidden}(j)} P(\alpha_{t+1} \mid h_k)\, P(i \mid \alpha_{t+1})}{\sum_{\alpha_{t+1}\in S_\mathrm{hidden}(j)} P(\alpha_{t+1} \mid h_k)}.$$

Here $P(\alpha_{t+1} \mid h_k)$ is a path-sum over hidden trajectories compatible with $h_k$.

Given the microscopic splitting-probability matrix $\Phi_{\alpha\beta}$, let $Q$ be its sub-block of transitions within $j$ and $R$ the sub-block of exits from $j$. The probability that the first exit from $j$ takes microstate $\alpha$ to microstate $\alpha'$ is

$$\Omega_{\alpha'\leftarrow\alpha} = P(\alpha' \mid \alpha,\ \text{exit } j \to \text{next observed state}) = [R\,(\mathbf{I}-Q)^{-1}]_{\alpha'\alpha},$$

and the history-projected transition probability becomes:

$$P(i \mid j, h_k) = \sum_{\alpha' \in S_\mathrm{hidden}(j)} P(\alpha' \mid j, h_k) \sum_{\beta \in S_\mathrm{hidden}(i)} \Omega_{\beta\leftarrow\alpha'}.$$
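A minimal numerical sketch of the first-exit computation $\Omega = R(\mathbf{I}-Q)^{-1}$, using made-up $Q$ and $R$ blocks for a two-microstate lump (column convention: entry $[a, b]$ is the probability of a one-step move $b \to a$):

```python
import numpy as np

# Hypothetical two-microstate lump j with two external exit microstates.
Q = np.array([[0.1, 0.3],       # moves that stay inside j
              [0.2, 0.4]])
R = np.array([[0.5, 0.1],       # moves that exit j
              [0.2, 0.2]])
# Columns of the stacked matrix [[Q], [R]] each sum to 1 (probability conservation).

# First-exit (splitting) probabilities
Omega = R @ np.linalg.inv(np.eye(2) - Q)
# Every column of Omega sums to 1: exit from j is eventually certain.
```

The column sums of $\Omega$ equaling one is a quick sanity check that $Q$ and $R$ were partitioned consistently.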

Fixing $j$, the matrix $\Pi^{(k)}(j)$ with entries $P_{i\leftarrow j|h_k}$ can be analyzed spectrally. Its eigenvalues satisfy $1 = \lambda_0^{(k)} > |\lambda_1^{(k)}| \ge \dots \ge |\lambda_{m-1}^{(k)}|$, where $m = \dim(\Pi)$.
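For a fixed $j$, this amounts to an ordinary eigenvalue computation on a column-stochastic matrix; the matrix values below are hypothetical:

```python
import numpy as np

# Hypothetical history-conditioned projector for a fixed observed state j:
# columns index histories h_k, rows index next observed states i (column-stochastic).
Pi = np.array([[0.7, 0.4],
               [0.3, 0.6]])

eig = np.linalg.eigvals(Pi)
eig = eig[np.argsort(-np.abs(eig))]          # sort by decreasing magnitude
lambda_0, lambda_1 = eig[0], eig[1]
# lambda_0 = 1 by stochasticity; |lambda_1| < 1 sets the local memory timescale.
```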

3. Data-Driven Estimation and Algorithm

Empirical construction proceeds by traversing a time series $s[1..T] \in S_\mathrm{obs}$:

from collections import defaultdict

def history_conditioned_projector(s, K_max):
    """Estimate P_{i<-j|h_k} for k = 0..K_max from a 0-indexed trajectory s."""
    N = defaultdict(lambda: defaultdict(int))      # N[(k, j, h_k)][i]
    for t in range(K_max, len(s) - 1):
        j, i = s[t], s[t + 1]
        for k in range(K_max + 1):
            h_k = tuple(s[t - k : t])              # k observed states preceding j
            N[(k, j, h_k)][i] += 1
    P = {}
    for key, counts in N.items():
        N_j = sum(counts.values())                 # N_j(h_k)
        P[key] = {i: n / N_j for i, n in counts.items()}
    return P

Each slice is normalized so that $\sum_i P_{i\leftarrow j|h_k} = 1$. The maximal $k$ is increased until the transition histograms converge, as measured by the total-variation distance

$$\mathrm{TVD}_j(k) = \tfrac{1}{2} \sum_{i\in S_\mathrm{obs}} \sum_{h_k} \left| P^{(k)}_{i\leftarrow j|h_k} - P^{(k-1)}_{i\leftarrow j|h_k} \right|,$$

declaring convergence when $\mathrm{TVD}_j(k) < \epsilon$ (e.g., $\epsilon = 10^{-4}$).
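A sketch of this convergence test, assuming histograms are stored as dicts mapping (history, outcome) pairs to probabilities, and that the order-$(k-1)$ histogram is looked up on the truncated history $h_k[1{:}]$ (an implementation choice not fixed by the text):

```python
def tvd(P_k, P_km1):
    """Total-variation distance between order-k and order-(k-1) histograms.

    Both arguments map (history, outcome) -> probability; the order-(k-1)
    histogram is evaluated on the truncated history h[1:].
    """
    return 0.5 * sum(abs(p - P_km1.get((h[1:], i), 0.0))
                     for (h, i), p in P_k.items())

# Converged once tvd(P_k, P_km1) < eps, e.g. eps = 1e-4
```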

4. Memory Quantification and Spectral Analysis

The subleading eigenvalue $\lambda_1^{(k)}$ of $\Pi^{(k)}$ determines the local memory timescale:

$$\tau^{(k)} = -\frac{1}{\ln|\lambda_1^{(k)}|}.$$

As $k$ increases, $\lambda_1^{(k)} \to \lambda_1^{(\infty)}$ and $\tau^{(k)} \to \tau^{(\infty)}$, characterizing the number of steps over which hidden-path correlations decay. A large $|\lambda_1^{(k)}|$ (i.e., long $\tau$) indicates slow decay and persistent hidden-path memory, signifying insufficient Markovianity at the observed level.
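The timescale formula in code, a trivial helper shown only to make the units concrete ($\tau$ is measured in observation steps):

```python
import math

def memory_timescale(lambda_1):
    """tau = -1 / ln|lambda_1|, in units of observation steps."""
    return -1.0 / math.log(abs(lambda_1))

# |lambda_1| = 1/e corresponds to tau = 1 step; |lambda_1| -> 1 gives long memory.
```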

5. Canonical Example: Three-State System with Hidden Path

Consider observed states $A$, $B$, $C$, where $B$ is actually two hidden microstates $B_1, B_2$. The transition scheme is $A \leftrightarrow B_1$, $A \leftrightarrow B_2$, $B_1 \to C$, $B_2 \to C$, $C \to A$. The microscopic splitting probabilities are $\Phi_{B_1\to C} = p_1$ and $\Phi_{B_2\to C} = p_2$; from $A$ the system enters $B_1$ or $B_2$ each with probability $1/2$, and $C \to A$ with probability $1$.

For $k=0$ (no history), $P(C|B) = \tfrac{1}{2}(p_1 + p_2)$. For $k=1$ (history of one step), there are two histories, $h_1 = a$ (from $A$) and $h_1 = c$ (from $C$):

$$P(C|B, h_1 = a) = p_a = p_1 w_{1a} + p_2 w_{2a}, \qquad P(C|B, h_1 = c) = p_c = p_1 w_{1c} + p_2 w_{2c},$$

where $w_{1a}, w_{2a}, w_{1c}, w_{2c}$ are the arrival fractions at $B_1, B_2$ coming from $A$ or $C$. Thus $\Pi^{(1)}_{\cdot\leftarrow B}$ is the $2\times 2$ column-stochastic matrix with columns $(p_a, 1-p_a)$ and $(p_c, 1-p_c)$, whose eigenvalues are $1$ and $p_a - p_c$. The timescale is then $\tau^{(1)} = -1/\ln|p_a - p_c|$.

As $k \to \infty$, the path weights $w_1, w_2$ approach their stationary values, $\lambda_1^{(k)} \to |p_1 - p_2|$, and $\tau^{(\infty)} = -1/\ln|p_1 - p_2|$.
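The eigenvalue structure of this example can be checked numerically; the splitting probabilities and arrival fractions below are arbitrary illustrative values:

```python
import numpy as np

# Illustrative parameter values (not from the source).
p1, p2 = 0.9, 0.3          # splitting probabilities B1 -> C and B2 -> C
w1a, w2a = 0.8, 0.2        # arrival fractions at B1, B2 given history h_1 = a
w1c, w2c = 0.4, 0.6        # arrival fractions given history h_1 = c

p_a = p1 * w1a + p2 * w2a
p_c = p1 * w1c + p2 * w2c

# 2x2 column-stochastic projector: columns = histories (a, c), rows = outcomes (C, not C)
Pi1 = np.array([[p_a,     p_c],
                [1 - p_a, 1 - p_c]])
lam = sorted(np.abs(np.linalg.eigvals(Pi1)), reverse=True)
# Subleading eigenvalue equals |p_a - p_c|, so tau^(1) = -1 / ln|p_a - p_c|.
```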

6. Testing Markov Property and Inferring Hidden Structure

The local history-conditioned projector provides a direct test for hidden memory. If, for state $j$, $\Pi^{(k)}_{\cdot\leftarrow j}$ becomes independent of $k$ for all $k \ge n$, then the transitions out of $j$ are locally Markov of order $n$; any persistent $k$-dependence reflects hidden-path memory or missing microstates.

If the histograms of $\Pi^{(k)}_{i\leftarrow j}$ show multiple distinct bars across histories, or $\Pi^{(k)}$ has more than one significant eigenvalue, one proposes splitting $j$ into $j_1, j_2$, assigning rates so that $(p_1, p_2)$ match these bars. Recomputing $\Pi^{(k)}$ after re-lumping validates the split if Markovianity improves.

7. Practical Considerations and Limitations

Building all histograms up to maximal history $K$ has computational complexity $\mathcal{O}(TK)$ and storage cost $\mathcal{O}(|S_\mathrm{obs}|^K |S_\mathrm{obs}|^2)$ in the worst case. Reliable estimation of $P_{i\leftarrow j|h_k}$ generally requires $N_j(h_k) \gtrsim 100$ samples per history; since the number of history classes grows as $|S_\mathrm{obs}|^k$, $K$ must be kept modest or histories coarsened by binning. Undersampling therefore becomes severe exponentially fast in $k$. Convergence proofs require at least one observable Markov state; application to continuous observables depends on the choice of bin width $\delta$.
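To make the storage scaling $\mathcal{O}(|S_\mathrm{obs}|^K |S_\mathrm{obs}|^2)$ concrete, a back-of-envelope count with hypothetical sizes:

```python
# Back-of-envelope worst-case storage count (hypothetical sizes).
n_obs = 10                              # |S_obs|
K = 4                                   # maximal history length
histories = n_obs ** K                  # worst-case number of history classes
entries = histories * n_obs * n_obs     # histogram entries N[j][h_k][i]
print(entries)                          # 1000000
```

Already at ten observed states and four steps of history, a million histogram entries compete for at most $T$ observed transitions, which is why undersampling dominates long before $K$ gets large.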


For detailed derivations and further implementation guidance, see (Zhao et al., 14 Mar 2025).
