Admissible Information Structure
- Admissible information structures are rigorously defined frameworks that characterize allowable information sets and probability assignments under constraints like subclassicality and non-anticipativity.
- They employ generalized event logics, directed acyclic graphs, and martingale filtrations to enforce local normalization and global compatibility for consistent inference.
- These frameworks enable tractable learning, equilibrium analysis, and arbitrage-free pricing across diverse applications from quantum foundations to financial markets and Bayesian games.
An admissible information structure is a rigorously defined mathematical framework for the types of information sets, probability assignments, and dynamic filtrations that underlie probabilistic modeling in fields such as quantum foundations, mathematical finance, reinforcement learning, and Bayesian games. Its purpose is to characterize precisely which information architectures and associated probability assignments or filtrations are logically and operationally consistent given the requirements of subclassicality, predictability, and statistical inference. The precise formalization depends on context, but key general features include enforcing local rules (e.g., classical additivity within measurement contexts or non-anticipativity in filtrations), guaranteeing global compatibility where possible, and specifying the minimal data needed for consistent inference or pricing.
1. Generalized Event Logics and Frame Functions
In the context of quantum foundations, Svozil's framework models admissible information structures as generalized event logics. Here, the primary object is a pair $(A, \mathcal{C})$, where $A$ is a set of elementary "yes–no" events (atoms) and $\mathcal{C}$ is a family of Boolean subalgebras called contexts, each representing a maximal set of mutually co-measurable observables (Svozil, 2015). Each context is a finite Boolean algebra, and different contexts may overlap by sharing atoms.
Probabilities are assigned as frame functions $p: A \to [0,1]$, required to sum to $1$ over the atoms of every context: for each $C \in \mathcal{C}$, $\sum_{a \in C} p(a) = 1$. This enforces the additivity and normalization of probabilities within each context (subclassicality), though the global event logic need not be Boolean.
The class of admissible frame functions includes extreme points (two-valued measures), which assign $1$ to a single atom per context and $0$ to others (classical deterministic truth assignments), and the convex hull of such measures (classical probabilities). In quantum theory, frame functions correspond to density-matrix Born-rule assignments via Gleason's theorem. More exotic logics admit frame functions not realizable by either classical or quantum theories, highlighting the concept's generality (Svozil, 2015).
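As a concrete illustration, the sketch below checks frame-function admissibility on a toy "triangle" logic of three contexts that pairwise share an atom, and enumerates its two-valued measures by brute force. The atom labels and context choices are illustrative, not taken from Svozil (2015):

```python
from itertools import product

# Three contexts (maximal Boolean subalgebras), pairwise sharing an atom.
contexts = [("a", "b", "c"), ("c", "d", "e"), ("e", "f", "a")]
atoms = sorted({x for ctx in contexts for x in ctx})

def is_frame_function(p, tol=1e-9):
    # Subclassicality: probabilities are additive and normalized within
    # every context, even though the global logic need not be Boolean.
    return all(abs(sum(p[x] for x in ctx) - 1.0) < tol for ctx in contexts)

# Extreme points: two-valued (0/1) measures, i.e. classical deterministic
# truth assignments selecting exactly one atom per context.
two_valued = [
    dict(zip(atoms, bits))
    for bits in product((0, 1), repeat=len(atoms))
    if is_frame_function(dict(zip(atoms, bits)))
]
print(len(two_valued), "two-valued measures on this logic")

# A dispersive frame function: weight 1/2 on each shared atom.
p = {"a": 0.5, "b": 0.0, "c": 0.5, "d": 0.0, "e": 0.5, "f": 0.0}
assert is_frame_function(p)
```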
2. Admissibility in Sequential Decision Models and Reinforcement Learning
In sequential team and game theory, the information structure specifies how variables (system states and actions), observed at sequential time steps, depend on previous events. The POST (Partially Observable Sequential Team) model formalizes this by encoding all conditional dependencies in a directed acyclic graph (DAG), where each node's information set is a subset of prior indices (Altabaa et al., 2024).
Admissibility in this context is characterized statistically: an information structure is admissible (i.e., poly-sample learnable) if the size of the minimal d-separators between past and future observables in the intrinsic DAG remains polynomial in the problem parameters. This ensures that efficient low-rank predictive state representations exist and that sample-complexity bounds for learning or planning are tractable. The key point is that admissible structures have small separating information sets, making exploration and statistical inference feasible. Classical models such as MDPs and POMDPs are special admissible cases, while the POST framework captures more general structures (Altabaa et al., 2024).
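The d-separation computation at the heart of this criterion can be made concrete via the standard ancestral-moralization reduction. The sketch below, using a hypothetical toy DAG rather than the POST construction of Altabaa et al. (2024), tests whether a candidate separator set renders past and future observables conditionally independent:

```python
from collections import deque

def d_separates(parents, xs, ys, zs):
    """True iff zs d-separates xs from ys in the DAG given by `parents`
    (a dict mapping each node to a tuple of its parent nodes)."""
    # 1. Restrict to the ancestral subgraph of xs, ys, and zs.
    relevant, stack = set(), list(xs | ys | zs)
    while stack:
        v = stack.pop()
        if v not in relevant:
            relevant.add(v)
            stack.extend(parents.get(v, ()))
    # 2. Moralize: undirect parent-child edges and marry co-parents.
    adj = {v: set() for v in relevant}
    for v in relevant:
        ps = [p for p in parents.get(v, ()) if p in relevant]
        for p in ps:
            adj[v].add(p)
            adj[p].add(v)
        for i, p in enumerate(ps):
            for q in ps[i + 1:]:
                adj[p].add(q)
                adj[q].add(p)
    # 3. Delete zs; d-separation holds iff xs and ys are disconnected.
    frontier = deque(xs - zs)
    seen = set(frontier)
    while frontier:
        v = frontier.popleft()
        if v in ys:
            return False
        for w in adj[v] - zs - seen:
            seen.add(w)
            frontier.append(w)
    return True

# Toy chain o1 -> s -> o2: the latent node s separates the past
# observation o1 from the future observation o2; the empty set does not.
parents = {"s": ("o1",), "o2": ("s",)}
print(d_separates(parents, {"o1"}, {"o2"}, {"s"}))   # True
print(d_separates(parents, {"o1"}, {"o2"}, set()))   # False
```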
3. Admissible Filtrations in Martingale Pricing and Financial Markets
In continuous-time financial mathematics, the admissible information structure is formalized as a right-continuous, complete subfiltration $\mathbb{G} = (\mathcal{G}_t)_{t \geq 0}$ of a filtered probability space $(\Omega, \mathcal{F}, \mathbb{F}, P)$, subject to asset prices being adapted semimartingales (Dominguez, 18 Jan 2026). Admissibility requires:
- Each asset price remains a $\mathbb{G}$-adapted semimartingale.
- All admissible trading strategies (those which ensure non-negative wealth processes) are $\mathbb{G}$-predictable (non-anticipative).
- Enlarging $\mathbb{G}$ cannot introduce arbitrage or mathematical pathologies, ruling out insider information.
Given an asset group $\mathcal{A}$, the minimal pricing-sufficient filtration for $\mathcal{A}$ is defined as the intersection of all admissible filtrations under which an equivalent martingale measure exists for $\mathcal{A}$. This filtration is unique up to null sets, stable under restriction to subgroups, and compatible under aggregation whenever a common martingale measure exists. However, when there are three or more independent unspanned drivers, global compatibility fails: no single admissible filtration can make all assets martingales under a common measure. The obstruction is equivalent to a failure of admissible dynamic completeness. Numerical diagnostics using discrete Doob–Meyer decompositions empirically distinguish admissible from inadmissible filtrations: only the latter expose predictability that admissible structures suppress (Dominguez, 18 Jan 2026).
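The flavor of this diagnostic can be reproduced numerically. In the minimal sketch below (a simulated Gaussian random walk, not the specific construction of Dominguez, 18 Jan 2026), the next increment of a martingale is regressed on information available under two filtrations: the walk's own past, which is admissible and exposes no predictable part, and an enlargement by the terminal value, which does:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, T = 20_000, 50
dX = rng.standard_normal((n_paths, T))   # i.i.d. increments of a martingale
X = np.cumsum(dX, axis=1)                # X[:, k] = value after k+1 steps

k = 24                                   # a date strictly before the horizon
target = dX[:, k + 1]                    # next increment
adapted = X[:, k]                        # measurable w.r.t. the walk's own past
insider = (X[:, -1] - X[:, k]) / (T - 1 - k)   # peeks at the terminal value

def ols_slope(pred, y):
    """Slope of a one-regressor least-squares fit of y on pred."""
    pred = pred - pred.mean()
    return float(pred @ (y - y.mean()) / (pred @ pred))

# Under the admissible (own-past) filtration the increment has no
# predictable part; under the insider enlargement a bridge-like drift
# of slope ~1 toward the terminal value appears.
print(f"adapted slope ~ {ols_slope(adapted, target):+.3f}")   # ~ 0
print(f"insider slope ~ {ols_slope(insider, target):+.3f}")   # ~ 1
```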
4. Admissibility and Observational Equivalence in Bayesian Games
In linear–quadratic–Gaussian (LQG) Bayesian games, the admissible information structure connects to observational equivalence (Miyashita, 2024). Each agent $i$ receives a private signal $x_i$ and chooses an equilibrium action $a_i$. The collection $(x_1, \ldots, x_n)$ defines the full information architecture. Two structures are "admissible" (observationally equivalent) if they give rise to the same equilibrium action distribution; this is formalized as equality of the induced covariances $\operatorname{Var}(a)$ and $\operatorname{Cov}(a, \theta)$, where $\theta$ denotes the fundamental state.
A canonical one-dimensional signal structure exists for any observational-equivalence class, giving a tight lower bound on the precision $\tau_i$ (the variance reduction on the fundamental state due to agent $i$'s signal). The canonical structure is attained if signals are truly one-dimensional or there are no strategic interactions. The admissible range of each agent's precision is an interval $[\underline{\tau}_i, \overline{\tau}_i]$, with the lower endpoint achieved uniquely by the canonical structure (Miyashita, 2024).
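In the special case of a single agent with no strategic interaction, the equilibrium action is the posterior mean, and equivalence reduces to matching its induced moments. The sketch below (with illustrative Gaussian parameters, not those of Miyashita, 2024) compares a two-dimensional signal structure with its pooled-precision one-dimensional counterpart:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
theta = rng.standard_normal(n)           # fundamental state, prior N(0, 1)
tau0 = 1.0                               # prior precision

# Structure A: a two-dimensional private signal with precisions tau1, tau2.
tau1, tau2 = 1.0, 0.25
x1 = theta + rng.standard_normal(n) / np.sqrt(tau1)
x2 = theta + rng.standard_normal(n) / np.sqrt(tau2)
a_A = (tau1 * x1 + tau2 * x2) / (tau0 + tau1 + tau2)   # posterior-mean action

# Structure B: the canonical one-dimensional signal pools the precisions,
# drawn here with *independent* noise so the two structures share nothing
# but their induced moments.
tau_s = tau1 + tau2
s = theta + rng.standard_normal(n) / np.sqrt(tau_s)
a_B = tau_s * s / (tau0 + tau_s)

for name, a in [("2-dim", a_A), ("canonical", a_B)]:
    print(f"{name:>9}: Var(a)={a.var():.4f}  Cov(a,theta)={np.cov(a, theta)[0, 1]:.4f}")
# Both rows agree up to Monte Carlo error: the structures induce the same
# equilibrium action distribution, i.e. they are observationally equivalent.
```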
5. Key Examples and Illustrations
A comparative summary of prominent admissible information structures from various fields is shown below.
| Domain | Structural Object | Admissibility Criterion |
|---|---|---|
| Quantum foundations | Event logic $(A, \mathcal{C})$ | Context-wise subclassicality of frame functions (Svozil, 2015) |
| RL/Sequential teams | POST DAG | Separator width polynomial in problem size (Altabaa et al., 2024) |
| Finance | Filtration $\mathbb{G}$ | Non-anticipativity, semimartingale property (Dominguez, 18 Jan 2026) |
| Bayesian games | Signal structure $(x_1, \ldots, x_n)$ | Equilibrium covariance matching (Miyashita, 2024) |
Quantum logic examples such as the pentagon logic with Wright's state and triangle configurations illustrate admissible frame functions beyond the classical and quantum cases. In finance, the minimal obstruction with three disjoint slow drivers renders global martingale pricing inadmissible, underscoring inherent informational limits.
6. Synthesis and Minimal Scaffold for Empirical Theories
Admissible information structures provide a unified abstraction for constraints on probability, inference, and trading imposed by the information architecture of a system. The minimal data, rendered schematically in the sketch after this list, consist of:
- The set of observable or elementary events/variables (atoms or system nodes).
- Contexts or filtration/conditioning data, defining what constitutes an "accessible" or "admissible" portion of information.
- The set of admissible assignments: probability measures (frame functions), filtrations, or covariance parametrizations consistent with local and global constraints.
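As a schematic rendering only, with hypothetical class and field names chosen to make the three ingredients above concrete, this scaffold might be expressed as:

```python
from dataclasses import dataclass
from typing import Callable, FrozenSet, Mapping, Sequence

@dataclass(frozen=True)
class AdmissibleInformationStructure:
    atoms: FrozenSet[str]                        # elementary events/variables
    contexts: Sequence[FrozenSet[str]]           # accessible conditioning data
    # Local and global constraints on assignments (frame functions,
    # filtrations, covariance parametrizations, ...), as a predicate.
    is_admissible: Callable[[Mapping[str, float]], bool]

    def admits(self, assignment: Mapping[str, float]) -> bool:
        """An assignment is admissible if it covers every atom and
        satisfies the structure's constraint predicate."""
        return set(assignment) >= set(self.atoms) and self.is_admissible(assignment)
```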
This scaffold is minimal in that any empirical theory (classical, quantum, economic, or computational) must specify its admissible information structure to derive meaningful predictive or pricing rules, and to delineate what is formally allowed or observationally indistinguishable in principle. The concept enables systematic characterization of the power and limits of inference and prediction across diverse theoretical and applied settings (Svozil, 2015; Dominguez, 18 Jan 2026; Altabaa et al., 2024; Miyashita, 2024).