
MARS: Maximum-Entropy Reconstruction

Updated 31 January 2026
  • Maximum-entropy Reconstruction (MARS) solves ill-posed inverse problems by choosing the admissible solution with the highest entropy, yielding the least biased inference from the available data.
  • Its mathematical foundations are Shannon and relative-entropy maximization under constraints, solved with iterative optimization techniques such as Newton–Raphson, L-BFGS-B, and adaptive-support methods.
  • MARS is applied in diverse fields, including astrophysical mass mapping, probabilistic density estimation, and network inference in finance, delivering robust regularization and uncertainty quantification.

Maximum-entropy Reconstruction (MARS) is a family of methods for solving ill-posed inverse problems by selecting, among all admissible solutions consistent with given constraints, the one that maximizes (relative) entropy. Entropy maximization regularizes these inverse problems by minimally committing to information not supported by the data; the resulting solution is the most "uninformative" or least biased subject to the constraints. MARS encompasses classic applications in function and probability distribution reconstruction from moments, free-form lensing reconstructions in astrophysics, network inference in economics and finance, low-rank matrix recovery, and adaptive image regularization, among others. This article synthesizes key mathematical foundations, algorithms, and applications of MARS, with citations to representative developments in astrophysics, statistics, econophysics, and applied inverse problems.

1. Mathematical Foundations

The core principle of MARS is to resolve underdetermined inference by maximizing Shannon entropy (or relative entropy, i.e., minimizing the Kullback–Leibler divergence to a default model) over the set of models consistent with the known measurements:

  • Shannon entropy: $S[p] = -\int p(x)\,\ln p(x)\,dx$.
  • Relative entropy (cross-entropy): $S[p\Vert m] = -\int p(x)\,\ln\!\left[\,p(x)/m(x)\,\right] dx$, where $m(x)$ is a default model or prior.

In classical moment-inverse problems, MARS solves
$$\begin{aligned} \text{maximize}\quad & S[p] \\ \text{subject to}\quad & \int x^k p(x)\,dx = \mu_k,\quad k = 0,\ldots,M,\quad p(x) \geq 0, \end{aligned}$$
with or without explicit support constraints, depending on the problem domain (Biswas et al., 2010, Andreychenko et al., 2014). The unique solution is of exponential form
$$p(x;\lambda) = \frac{1}{Z(\lambda)}\exp\!\left(-\sum_{k=1}^{M}\lambda_k x^k\right),$$
with the normalization $Z(\lambda)$ (partition function) fixed by the moment constraints.
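
The dual minimization behind this solution can be sketched numerically with grid quadrature on a bounded support (a minimal illustration, not taken from any of the cited implementations; `maxent_density` and its arguments are hypothetical names):

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize

def maxent_density(moments, support=(0.0, 1.0), n_grid=400):
    """Recover p(x) = exp(-sum_k lambda_k x^k) / Z(lambda) on `support`,
    matching E[x^k] = moments[k-1] for k = 1..M, by minimizing the
    convex dual Gamma(lambda) = log Z(lambda) + sum_k lambda_k mu_k."""
    a, b = support
    x = np.linspace(a, b, n_grid)
    powers = np.vstack([x ** k for k in range(1, len(moments) + 1)])
    mu = np.asarray(moments, dtype=float)

    def dual(lam):
        # log Z(lam) via trapezoidal quadrature on the grid
        logZ = np.log(trapezoid(np.exp(-lam @ powers), x))
        return logZ + lam @ mu

    lam = minimize(dual, np.zeros(len(mu)), method="BFGS").x
    p = np.exp(-lam @ powers)
    p /= trapezoid(p, x)  # normalize so p integrates to 1
    return x, p

x, p = maxent_density([0.3, 0.15])   # target E[x], E[x^2]
print(trapezoid(x * p, x))           # ~0.3: first moment recovered
```

At the dual optimum the gradient components $\mu_k - \mathbb{E}_p[x^k]$ vanish, so the recovered moments match the targets to the optimizer's tolerance.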

For discrete distributions, including unbounded support, the sum replaces the integral, and the Lagrangian dual minimization is implemented over an adaptively extended support interval to ensure tail-accuracy and computational stability (Andreychenko et al., 2014).
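
A minimal sketch of the adaptive-support idea, assuming the support is enlarged by doubling until the mass at the right endpoint is negligible (the cited method's actual enlargement rule may differ):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def discrete_maxent(moments, n0=20, tail_tol=1e-10):
    """Discrete maximum-entropy fit with adaptive support: solve the
    convex dual on {0, ..., n}, then double n until the probability
    at the endpoint is negligible (tail accuracy)."""
    mu = np.asarray(moments, dtype=float)
    n = n0
    while True:
        x = np.arange(n + 1, dtype=float)
        powers = np.vstack([x ** k for k in range(1, len(mu) + 1)])

        def dual(lam):
            # log-sum-exp replaces the integral for discrete support
            return logsumexp(-lam @ powers) + lam @ mu

        lam = minimize(dual, np.zeros(len(mu)), method="BFGS").x
        logp = -lam @ powers
        p = np.exp(logp - logsumexp(logp))
        if p[-1] < tail_tol:   # support is large enough
            return x, p
        n *= 2                 # enlarge support and re-solve

x, p = discrete_maxent([3.0, 12.0])  # target E[X], E[X^2]
```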

In regularized inverse problems, e.g., mass mapping from lensing (Cha et al., 2022), the MARS objective combines a data-fidelity (likelihood) term $\chi^2$ with an entropy penalty:
$$f[\kappa] = \chi^2[\kappa] + r\,R[\kappa, p],$$
where $R[\kappa, p]$ is the pixel-level cross-entropy between the reconstructed mass map $\kappa$ and a prior $p$, controlled by the regularization parameter $r$.
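
In schematic form, such an objective can be evaluated as follows (a toy illustration, not the actual lensing code; `forward` stands in for the lens-mapping operator and defaults to the identity):

```python
import numpy as np

def mars_objective(kappa, data, sigma, prior, r, forward=lambda k: k):
    """Toy version of f[kappa] = chi^2[kappa] + r * R[kappa, p]:
    quadratic data fidelity plus a pixelwise cross-entropy penalty
    against a prior map (kappa and prior assumed positive)."""
    chi2 = np.sum(((forward(kappa) - data) / sigma) ** 2)
    R = np.sum(kappa * np.log(kappa / prior))  # cross-entropy term
    return chi2 + r * R

kappa = np.array([1.0, 2.0, 1.5])
f = mars_objective(kappa, data=np.array([1.1, 1.9, 1.4]),
                   sigma=0.1, prior=kappa.copy(), r=0.05)
```

With the prior equal to the current solution the entropy term vanishes, so only the data-fidelity term contributes.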

2. Algorithmic Structure and Optimization

Table 1 summarizes algorithmic structure for various problem classes.

| Domain | Entropy Form | Constraint Type | Optimization Routine |
|---|---|---|---|
| Function/moment problems | $S[p]$ or $S[p\Vert m]$ | Moments (hard/soft) | Newton–Raphson / coordinate descent |
| Discrete distributions | $S[g]$ | Discrete moments | Damped Newton, adaptive support |
| Strong lensing maps | $R[\kappa, p]$ (KL) | Image positions, smoothness | L-BFGS-B, Adam, prior smoothing |
| Network construction | $S[X]$ or $S[\mathbb{P}]$ | Margins, sparsity | Alternating scaling, QP/NNLS |
| Markov models | $S(T)$ (conditional) | Autocorrelation, norm | Lagrangian solution, iterative |

For function and moment problems, the dual minimization over Lagrange multipliers is unconstrained, convex, and efficiently solved by iterative Newton-type methods, often with multiplicative or coordinate-wise updates for moment matching (Biswas et al., 2010, Zhang et al., 2023). Discrete-support problems benefit from adaptive support enlargement for efficiency and accuracy (Andreychenko et al., 2014).

Spatial inference tasks (e.g., mass maps) apply MARS to high-dimensional parameterizations, using entropy as a regularizer to suppress spurious fluctuations while retaining data-driven structure. Iterative prior updating—by convolving/smoothing the solution—enforces global smoothness and suppresses overfitting (Cha et al., 2022, Cha et al., 2023, Cha et al., 2023). The two-step (coarse-to-fine) and multiresolution algorithms accelerate convergence and promote physical plausibility.
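
The iterative prior-updating loop can be sketched on a toy 1D problem (identity forward operator, Gaussian smoothing of the current solution as the new prior; hypothetical names, not the published pipeline):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.optimize import minimize

def reconstruct(data, r=0.1, n_outer=4, smooth=2.0):
    """Entropy-regularized fitting with iterative prior updating:
    after each inner L-BFGS-B fit, the prior is replaced by a
    smoothed copy of the current solution, enforcing global
    smoothness and suppressing spurious fluctuations."""
    prior = np.full_like(data, data.mean())
    kappa = prior.copy()
    for _ in range(n_outer):
        def f(k):
            chi2 = np.sum((k - data) ** 2)
            # generalized KL penalty; zero when k == prior
            ent = np.sum(k * np.log(k / prior) - k + prior)
            return chi2 + r * ent
        kappa = minimize(f, kappa, method="L-BFGS-B",
                         bounds=[(1e-8, None)] * data.size).x
        prior = gaussian_filter1d(kappa, smooth) + 1e-8  # new prior
    return kappa

rng = np.random.default_rng(0)
truth = 1.0 + np.exp(-0.5 * ((np.arange(64) - 32) / 6.0) ** 2)
kappa = reconstruct(truth + 0.05 * rng.standard_normal(64))
```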

For network and Markov systems, MARS reconstructs (possibly sparse) connectivity or transition matrices via entropy maximization subject to empirical constraints, using block-wise scaling, fixed-point, or alternating optimization schemes (Andrecut, 2017, Gangi et al., 2015, Hazan, 2018, Chliamovitch et al., 2014).
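
The alternating-scaling step can be illustrated with the classic RAS / iterative proportional fitting scheme (sparsity corrections such as sparse-RAS are omitted in this sketch):

```python
import numpy as np

def ras(seed, row_sums, col_sums, n_iter=100):
    """RAS / iterative proportional fitting: alternately rescale the
    rows and columns of a positive seed matrix until its margins
    match the targets (which must share the same total)."""
    W = np.asarray(seed, dtype=float).copy()
    for _ in range(n_iter):
        W *= (np.asarray(row_sums) / W.sum(axis=1))[:, None]
        W *= (np.asarray(col_sums) / W.sum(axis=0))[None, :]
    return W

W = ras(np.ones((3, 3)), row_sums=[1.0, 2.0, 3.0], col_sums=[2.0, 2.0, 2.0])
```

With a uniform seed this converges to the maximum-entropy matrix consistent with the margins; a zero entry in the seed stays zero, which is how structural sparsity can be imposed.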

3. Applications Across Scientific Domains

3.1 Gravitational Lensing and Astrophysics

MARS has enabled model-independent, free-form reconstruction of mass distributions in strong and weak gravitational lensing. The method suppresses overfitting prevalent in standard free-form methods by enforcing quasi-uniqueness via cross-entropy regularization. Key results include:

  • Convergence to source positions at the $\sim 0.001$ arcsecond level.
  • $<1\%$ accuracy in radial mass-profile recovery within multiply-imaged regions (Cha et al., 2022).
  • Ablation tests on Abell 1689 and Abell 2744 demonstrate excellent spatial alignment between entropy-regularized mass peaks and brightest cluster galaxy positions, without using light-traces-mass priors (Cha et al., 2022, Cha et al., 2023, Cha et al., 2023).

3.2 Moment Problem and Probabilistic Inverse Problems

MARS provides a robust mechanism for reconstructing continuous or discrete probability densities from incomplete moment data. For one-dimensional distributions with finite or infinite support, the entropy-maximizing solution is unique, smooth, and converges exponentially in $L^1$ as the number of moments increases (Biswas et al., 2010, Andreychenko et al., 2014, Zhang et al., 2023). For Laplace transform inversion—a classic ill-posed scenario—MARS achieves superresolution, accurately determining densities from a handful of moments by exploiting exponential-family structure (Gzyl, 2016).

3.3 Network Reconstruction in Economics and Finance

MARS and its sparsity-regularized extensions (e.g., sparse-RAS, FiCM+NNLS) allow reconstruction of interbank and macroeconomic transaction networks consistent with marginal data (e.g., bank asset/liability or national account balances) (Andrecut, 2017, Hazan, 2018). This enables statistically robust systemic risk assessment, realistic contagion cascade simulation, and principled scenario testing under partial observability. Sparsity penalties correct the classical ME bias toward fully-connected networks, essential for faithful systemic fragility analysis (Andrecut, 2017, Gangi et al., 2015).

3.4 Regularized and Information-Optimal Reconstruction

In low-rank matrix sensing, MARS (as MaxEnt) has been operationalized to design measurement schemes that maximize observation entropy, achieving improved error bounds and near-optimal information gain over random sampling (Mak et al., 2017). For imaging (e.g., confocal microscopy), adaptive, structure-aware maximum-entropy regularization, implemented at multiple resolutions, systematically outperforms total-variation when sampling is highly incomplete and incoherence is low (Francis et al., 2019).

3.5 Stochastic Processes and Markov Chains

MARS extends to estimation of Markov transition matrices from short, noisy time series by maximizing entropy rate subject to empirical autocorrelation and normalization, at times outperforming direct frequency estimation in small-sample settings and under non-stationarity (Chliamovitch et al., 2014).
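
As a toy illustration for a two-state chain with states $\{+1, -1\}$, the entropy rate can be maximized subject to a target one-step autocorrelation using a generic constrained optimizer (a sketch, not the Lagrangian solution of the cited paper):

```python
import numpy as np
from scipy.optimize import minimize

def maxent_chain(target_corr):
    """Two-state chain T = [[a, 1-a], [1-b, b]] with states {+1, -1}:
    maximize the entropy rate pi_1 H(a) + pi_2 H(b) over the
    stay-probabilities (a, b), subject to the one-step
    autocorrelation pi_1 (2a-1) + pi_2 (2b-1) = target_corr."""
    def H(p):  # binary entropy, clipped for numerical safety
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -p * np.log(p) - (1 - p) * np.log(1 - p)

    def stationary(a, b):
        z = (1 - b) + (1 - a)
        return (1 - b) / z, (1 - a) / z

    def neg_rate(v):
        p1, p2 = stationary(*v)
        return -(p1 * H(v[0]) + p2 * H(v[1]))

    def corr(v):
        p1, p2 = stationary(*v)
        return p1 * (2 * v[0] - 1) + p2 * (2 * v[1] - 1)

    res = minimize(neg_rate, [0.6, 0.6], method="SLSQP",
                   bounds=[(1e-6, 1 - 1e-6)] * 2,
                   constraints={"type": "eq",
                                "fun": lambda v: corr(v) - target_corr})
    return res.x  # stay-probabilities (a, b)

a, b = maxent_chain(0.4)
```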

4. Phase Transitions, Default Model Sensitivity, and Theoretical Guarantees

The typical performance of MARS in high-dimensional inverse problems depends critically on the default model. Replica-symmetric analysis reveals sharp phase transitions in recovery accuracy: even small discrepancies between the default model (prior) and the true distribution can trigger a transition from perfect to failed reconstruction as the measurement-to-unknowns ratio crosses a critical threshold. For continuous prior mismatch, the minimum required data rate grows as $O(\epsilon^2)$ with the prior error $\epsilon$ (Hitomi et al., 4 Apr 2025).

Comparison with $\ell_1$-norm (compressed sensing) methods shows MARS outperforms $\ell_1$ regularization when the default model is correct, but can degrade rapidly otherwise—highlighting the importance of prior selection and hyperparameter tuning in practical deployments (Hitomi et al., 4 Apr 2025).

The entropy convergence and Kullback–Leibler (CS) bounds guarantee that as moments or constraints increase and entropy stabilizes, the solution approaches the true target in total variation (Biswas et al., 2010, Gzyl, 2016). This theoretical underpinning is essential for quantifying uncertainty in high-dimensional inference.

5. Implementation Strategies and Practical Considerations

MARS algorithms generally reduce to convex dual optimization, enabling efficient solutions with convergence guarantees. Computational challenges such as high dimensionality (e.g., $O(10^5)$ parameters in cluster lensing), ill-conditioning (moment problems), and unbounded support (discrete distributions) are mitigated by:

  • convex dual formulations solved with Newton-type or quasi-Newton (L-BFGS-B) methods;
  • adaptive support enlargement for discrete distributions;
  • iterative prior updating via smoothing of the current solution;
  • coarse-to-fine and multiresolution schemes.

Hyperparameter tuning is guided empirically by diagnostic curves (e.g., image-plane RMS versus entropy weight) and validated on synthetic and real benchmarks.

6. Extensions and Domain-Specific Adaptations

MARS frameworks have been extended to applications where constraints are soft (via Gaussian penalties), supporting Bayesian moment-constraint uncertainty (Zhang et al., 2023), and to composite regularization objectives combining entropy with other priors (e.g., sparsity, quadratic derivatives, empirical data models) (Francis et al., 2019, Mak et al., 2017). This flexibility enables adaptation to structure-specific or domain-specific requirements—critical in fields such as observational cosmology, quantum inference, linguistics, and econometric modeling.

In macroeconomic and financial systems, MARS and entropy-based ensembles serve as scenario generators for stress-testing systemic metrics against baseline periods, supporting regulatory and early-warning functions (Gangi et al., 2015, Hazan, 2018).

7. Conclusion

Maximum-entropy Reconstruction (MARS), through a blend of mathematical rigor and computational adaptability, provides a unifying principle and a suite of tractable algorithms for diverse inverse and inference problems. Its performance, generality, and theoretical guarantees rely crucially on the quality of constraints (or data), the choice of default model, and proper regularization. These aspects are actively investigated in recent theoretical and applied work (Hitomi et al., 4 Apr 2025, Cha et al., 2022, Andreychenko et al., 2014, Mak et al., 2017, Biswas et al., 2010).
