
The Physics of Causation

Published 2 Jan 2026 in physics.hist-ph, physics.bio-ph, q-bio.MN, and q-bio.PE | (2601.00515v1)

Abstract: Assembly theory (AT) introduces a concept of causation as a material property, constitutive of a metrology of evolution and selection. The physical scale for causation is quantified with the assembly index, defined as the minimum number of steps necessary for a distinguishable object to exist, where steps are assembled recursively. Observing countable copies of high assembly index objects indicates that a mechanism to produce them is persistent, such that the object's environment builds a memory that traps causation within a contingent chain. Copy number and assembly index underlie the standardized metrology for detecting causation (assembly index), and evidence of contingency (copy number). Together, these allow the precise definition of a selective threshold in assembly space, understood as the set of all causal possibilities. This threshold demarcates life (and its derivative agential, intelligent and technological forms) as structures with persistent copies beyond the threshold. In introducing a fundamental concept of material causation to explain and measure life, AT represents a departure from prior theories of causation, such as interventional ones, which have so far proven incompatible with fundamental physics. We discuss how AT's concept of causation provides the foundation for a theory of physics where novelty, contingency and the potential for open-endedness are fundamental, and determinism is emergent along assembled lineages.

Summary

  • The paper introduces Assembly Theory, a framework that quantifies causation by calculating assembly indices as the minimum causal steps required for an object's formation.
  • It demonstrates that a high assembly index signals selective construction, establishing a measurable threshold to differentiate spontaneous formations from those arising from persistent causal mechanisms.
  • Empirical studies on molecular systems validate the theory, linking object copy number and assembly thresholds to the emergence of complexity, open-ended evolution, and potential biosignatures.

The Physics of Causation: A Formalization via Assembly Theory

Introduction

"The Physics of Causation" (2601.00515) proposes a radical re-conceptualization of causation as a material, physically quantifiable property, rather than an interventionist or counterfactual notion. Assembly Theory (AT), central to this work, introduces a substrate-specific topological space—assembly space—where the assembly index (AI) quantifies the minimum number of causal steps required for a physical object to exist. By formalizing the physical limits on spontaneous object formation, the work establishes a rigorous, quantitative framework with significant implications for the physics of emergence, life detection, and the characterization of open-ended evolution.

Assembly Theory: Foundations and Metrology

Assembly Theory reconceives objects as the terminus of recursively assembled causal chains, where causation is embodied in the joining operations that create objects from units. The assembly space encapsulates all possible constructions, with each object assigned an assembly index, $a_i$, representing its minimal causal depth. The assembly threshold $a_M$ demarcates the AI above which objects cannot form spontaneously in a given physical system, bounded by system size ($N_T$), measurement resolution ($M$), and the combinatorial substrate branching factor ($b$):

$$a_M = 1 + \frac{\ln\big((1-B)\,N_T + B\big) - \ln M}{\ln b}$$

where $B = 1/b$ in the absence of environmental selection. The notion of "copy number" ($n_i$)—the countable abundance of an object—provides empirical evidence for persistent, contingent causal mechanisms, i.e., constructors.
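
As a worked check on this scaling (an illustrative sketch, not the paper's code), assume the threshold takes the form $a_M = 1 + [\ln((1-B)N_T + B) - \ln M]/\ln b$ with representative values $b \approx 25$ and a mass-spectrometry detection limit $M \approx 10^4$, both figures that appear later in this summary. A few lines of Python then reproduce the quoted thresholds at laboratory, planetary, and cosmological scales:

```python
import math

def assembly_threshold(N_T: float, M: float = 1.0, b: float = 25.0) -> float:
    """Maximum assembly index expected without selection, for a system of
    N_T building blocks, detection limit M copies, and branching factor b.
    The form of the expression is an assumption reconstructed from the text."""
    B = 1.0 / b  # no environmental selection
    return 1.0 + (math.log((1.0 - B) * N_T + B) - math.log(M)) / math.log(b)

# Molar-scale lab sample with MS detection limit M ~ 10^4 copies -> a_M ~ 15
lab = assembly_threshold(N_T=6.0e23, M=1.0e4)
# Earth-scale atom count, single-copy resolution (M = 1) -> a_1 ~ 37
earth = assembly_threshold(N_T=1.0e50)
# Observable-universe atom count -> a_1 ~ 58
cosmos = assembly_threshold(N_T=1.0e80)
print(round(lab), round(earth), round(cosmos))  # -> 15 37 58
```

That a single expression yields roughly 15 for a molar sample, 37 at Earth scale, and 58 at the scale of the observable universe (the $M = 1$ cases) matches the figures quoted elsewhere in this summary, and makes clear that $N_T$, $M$, and $b$ are the levers that set what can appear without selection.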

To operationalize causation as a measurable physical attribute, the assembly index is shown to be invariant across distinct experimental modalities (mass spectrometry, IR, NMR), reflecting its observer-independence and compatibility with quantum mechanical descriptions. This enables the definition and application of a standardized, physically intrinsic scale for constructed complexity.

The Spontaneous-Selected Threshold and Detection of Life

By quantifying the exponential expansion of combinatorial possibilities in assembly space, AT predicts a universal, albeit substrate-dependent, abiotic upper bound on observable AI. Objects with $a_i < a_1$ (the ontological threshold) may arise spontaneously, but objects with $a_i > a_1$ necessitate selective construction via persistent causal lineages—an essential physical signature of life and intelligence. Empirical studies on molecular systems (e.g., Taxol, ribosome, ATP) corroborate the theory: no abiotically derived molecule with $a_i > 15$ has been detected in molar-scale experiments, reflecting steep combinatorial suppression in the absence of selection.

Importantly, AT enables extrapolation from laboratory to planetary to cosmological scales; for instance, a molecule with $a_i > 58$ cannot arise abiotically even utilizing the observable universe's atom count. This provides a rigorous, system-size-dependent criterion for the detection of biosignature objects irrespective of molecular identity.

Materiality of Causation and the Assembly Space Ontology

Unlike counterfactual or interventionist accounts (e.g., Pearl's do-calculus), AT posits that causation is inherent in the material construction of objects, encoded in their AI and lineage. The assembly space, being both combinatorial and recursive, defines the universe of causal possibility without presupposing the fixed existence of all objects as in block-universe models.

Objects exist as endpoints of causal chains, and their repeated occurrence (high copy number) provides direct evidentiary support for persistent contingency—selection that is quantifiable independently of any particular environment. The concept of "virtual objects" formalizes partially realized structures in the assembly space, while the "assemblage" ($A$) aggregates the total density of causal possibility (weighted by AI and copy number) in a system.
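
The paper's equations for the assemblage are not reproduced in this summary; a minimal sketch, assuming the form used elsewhere in the Assembly Theory literature in which each object type's surplus copies are weighted by $e^{a_i}$, looks like:

```python
import math

def assemblage(objects: list[tuple[int, int]]) -> float:
    """Toy assemblage A for a list of (assembly_index, copy_number) pairs.
    Assumed form: A = sum_i e^{a_i} * (n_i - 1) / N_T, so a single-copy
    object (no evidence of a persistent constructor) contributes nothing."""
    N_T = sum(n for _, n in objects)  # total object count in the system
    return sum(math.exp(a) * (n - 1) / N_T for a, n in objects)

# Ten copies of a depth-3 object outweigh a one-off depth-5 object:
A = assemblage([(3, 10), (5, 1)])
```

The $(n_i - 1)$ factor encodes the copy-number logic above: one instance of a complex object is compatible with chance, while repeated instances accumulate weight as evidence of selection.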

Contingency, Open-endedness, and Transition to Life

AT permits a rigorous account of contingency: an object's existence depends on the reproducibility of its causal ancestors, encoded in the AI and validated by copy number. The existence of lineages with cumulative AI and persistent copy number marks a phase transition: the system becomes "alive" when it sustains a nontrivial population of high-AI objects above the abiotic threshold.

Open-ended evolution is explicated by the expansion of assembly space through the stabilization of novel object types—novelty emerges from rare, persistent interactions in the under-resolved environment that cross the discretization threshold. Deterministic dynamics and predictability are emergent, not fundamental; the state-space expands noncomputably over time as new objects and operations are defined only retrospectively, invalidating precomputable causal closure.

Entropy, Path-dependence, and Comparison to Existing Paradigms

Assembly index is fundamentally distinct from entropy—AI is a non-extensive, algorithmic invariant of an individual object, whereas entropy is an ensemble property descriptive of state multiplicities. The work highlights the critical role of path-dependence in non-equilibrium systems and asserts that AT discards the unphysical assumption of ergodic sampling over unbounded ensembles. Instead, AT founds non-equilibrium path-dependency on precise, measurable causation and contingency, reconciling physical description with the empirical uncomputability of possibilities in complex systems.

Further, AT challenges the interpretation of possibility versus probability: the existence of an object is not governed by ensemble likelihood but by the presence of a causal mechanism—probability statements pertain to systems, not objects, reinforcing the substrate-centric, material notion of causation.

Selection, Phase Transitions, and Universality

The assembly, $A$, condenses the cumulative selection and contingent causation in a system, providing a physically grounded parameter that marks the transition from abiotic matter to life (and further to intelligence and technology). The formalism admits precise, quantitative phase transitions, where a growing density of high-AI, high-copy objects signals the emergence (or extinction) of life-like behavior.

Prebiotic chemistry, often confounded with truly abiotic conditions, is shown to operate fundamentally within this selective regime, as human (or otherwise constructed) experimenter interventions effectively instantiate selective lineages.

Implications and Future Directions

By declaring the physical system-size and substrate as fundamental determinants of causal possibility, Assembly Theory yields a highly testable, resource-bound, and quantifiable framework. The formalization of material causation—distinct from existing interventionist accounts—enables the detection of living, technological, or potentially intelligent systems in principle via physical measurement alone.

The emergence of complexity, persistent selection, and dynamical "possibility expansion" are rendered consequences of real, measurable causation, not artifacts of fine-tuning or block-universe initial conditions. Future AI and astrobiology research may leverage AT in experimental life detection, understanding major evolutionary transitions, analyzing technological lineages, and elucidating the origins—and boundaries—of open-ended evolution.

Conclusion

"The Physics of Causation" establishes a rigorous, physically intrinsic account of causation, fundamentally distinct from extant formalizations. Assembly Theory introduces substrate-specific, quantifiable thresholds demarcating spontaneous from selected (and hence, living) objects based on assembly index and copy number. This formalism reconstitutes causation as a material property, underpinning a unified approach to the emergence of complexity, life, and intelligence, with broad implications for physical science, AI, and theories of evolution. The framework redefines the boundaries of what is possible and observable, offering a new lens for experimental and theoretical exploration.


Explain it Like I'm 14

A simple explanation of “The Physics of Causation”

What is this paper about?

This paper introduces a new way to think about “cause” in physics by treating it as a real, measurable feature of things in the world. The authors call this idea Assembly Theory (AT). AT gives us a tool to tell when an object (like a molecule or a gadget) needed a step‑by‑step construction process to exist, instead of just popping up by chance. Using AT, they argue we can draw a clear line between things that can form on their own and things that require life, intelligence, or technology to make them.

What questions are they asking?

  • Can we measure how much “causal work” it takes to make an object?
  • Is there a general rule that says when something is too complex to form by itself?
  • Can this rule help us tell if something was made by life (or intelligence) even if we don’t know the details of how it was made?

How do they study it?

They introduce a few simple ideas:

  • Assembly index: Think of building with LEGO or following a recipe. The assembly index of an object is the smallest number of steps needed to build it from basic parts, reusing sub‑assemblies whenever possible. Fewer steps means simpler; more steps means more constructed.
  • Copy number: How many identical copies of the object do you see? If you find lots of identical, complicated objects, that suggests there’s a reliable way (a “memory” or mechanism) that keeps making them—like a factory, a cell’s machinery, or a set of instructions.
  • Assembly space: Imagine a huge map of all the ways parts can be joined to make objects. Each step is a join, and objects are the endpoints of paths on this map. The map branches explosively because there are many ways to combine parts, like a choose‑your‑own‑adventure book with more and more options at each page.
  • Threshold idea (in plain terms): Because options grow so fast at each step, in any real, finite world (limited time, materials, and space), there’s a maximum build‑depth beyond which you won’t get many copies of objects unless something is actively selecting and guiding the process (like evolution or engineering). The authors back this up with math to show such a limit must exist.
  • Measurements: For molecules, they estimate assembly index using lab tools that break molecules and “read” their structure—mass spectrometry, NMR, and infrared spectroscopy. These very different tools give consistent results, which suggests assembly index is a real, intrinsic property of the molecule (like mass or charge), not just a measurement trick.
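
The assembly-index idea in the first bullet can be made concrete with a toy search over strings (an illustrative sketch, not the paper's algorithm): characters are the basic parts, one step joins any two objects already built, and the index is the fewest joins that reach the target.

```python
from itertools import count

def assembly_index(target: str, max_depth: int = 10) -> int:
    """Fewest join operations needed to build `target` from its characters,
    where any previously assembled substring may be reused (toy substrate)."""
    primitives = frozenset(target)          # single characters come for free
    if target in primitives:
        return 0
    frontier = {primitives}                 # each pool = objects built so far
    for depth in count(1):
        if depth > max_depth:
            raise RuntimeError("no construction found within max_depth joins")
        nxt = set()
        for pool in frontier:
            for x in pool:
                for y in pool:
                    s = x + y
                    if s == target:
                        return depth        # breadth-first => first hit is minimal
                    if s in target and s not in pool:
                        nxt.add(pool | {s})
        frontier = nxt

print(assembly_index("abcabc"))  # -> 3
```

Without reuse, "abcabc" would take five joins (one character at a time); building "abc" once and joining it to itself takes three, which is exactly the sub-assembly reuse the LEGO analogy describes.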

A helpful example from the paper is the molecule Taxol (from yew tree bark). The number of possible small molecules is astronomical. Randomly stumbling on Taxol in nature without a guided process is effectively impossible. But we find many copies of it in a specific tree because living cells have evolved a step‑by‑step mechanism to make it.

What did they find, and why does it matter?

  • There is a selective threshold: Below a certain assembly index, objects can appear “spontaneously” when conditions are right (they don’t need a special builder). Above that threshold, you only get objects if there’s a persistent, guiding mechanism that remembers how to make them (like evolution, a cell, or human technology).
  • Copies plus complexity are a signature of life or intelligence: Seeing many copies of a high‑assembly‑index object is strong evidence of a deep causal history—meaning selection has been at work along a lineage (biological, technological, or both).
  • The threshold doesn’t disappear even if you consider the whole universe: Even if you had all the atoms in the observable universe, there’s still a maximum build‑depth you’d expect to see without selection. Roughly speaking, if you ever found lots of copies of a molecule that requires “more than dozens of careful assembly steps,” that would be a strong sign of life or technology.
  • A new kind of “cause” that fits physics: Instead of defining causes as “what would happen if we intervened” (which is hard or impossible to test in many cases), AT defines causes as the actual joins along the shortest build path to an object. Cause becomes material: the object’s structure “remembers” the minimal steps it needed to exist.
  • A standardized measure: Because assembly index shows up consistently across very different instruments, it can serve as a metrology (a standard way to measure) for constructed complexity, much like temperature is standardized around absolute zero.
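
The universe-scale bound in the third bullet can be sanity-checked with one line of integer arithmetic, using rough illustrative numbers (about 25 options per join, roughly 10^80 atoms in the observable universe):

```python
from itertools import count

# With ~25 choices per join, how deep before the number of possible
# constructions (25**a) exceeds the ~10**80 atoms in the observable universe?
b, atoms = 25, 10**80
depth = next(a for a in count(1) if b**a > atoms)
print(depth)  # -> 58
```

Beyond that depth there are more distinct possibilities than atoms available to realize them, so finding many identical copies of such an object cannot be a lucky accident.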

What could this mean for science and the future?

  • Life detection: Space missions could look for many copies of high‑assembly‑index molecules as a strong, general sign of life, even if alien life is very different from Earth’s. We don’t need to know the exact biology—just that “selective construction” happened.
  • Origin of new lineages: AT offers a way to study big transitions (like chemistry → biology, or biology → technology) by tracking when systems cross the threshold from spontaneous to selected construction.
  • Rethinking physics: AT suggests novelty (new things appearing), memory, and open‑ended creativity are fundamental features of nature. Determinism (things being predictable) may emerge along built lineages because the environment stores “how to build” information that keeps being reused.
  • Practical chemistry and technology: By quantifying how hard something is to build, AT can guide synthetic chemistry, materials science, and even design—pointing to when you’ll need special catalysts, machinery, or code to reliably make a complex object.

In short, the paper proposes a simple but powerful test: if you find many copies of a thing that takes many careful steps to build, you have strong, physics‑based evidence of selection at work—very likely life or intelligence. Assembly Theory turns that intuition into a measurable, testable science.

Knowledge Gaps

Knowledge gaps, limitations, and open questions

Below is a single, consolidated list of concrete gaps and open questions the paper leaves unresolved, intended to guide future research.

  • Formalize the derivation of the assembly threshold (Eq. 1) and expected copy-number distribution (Eq. 2) with clear assumptions, consistent notation, and full proofs; the current presentation is terse, contains typographical ambiguities, and does not show normalization or boundary conditions explicitly.
  • Empirically estimate and model the branching factor $b$ across substrates, object classes, and assembly depth, replacing the constant-$b$ simplification with substrate-specific $b_i(a_i)$ and quantifying how $b$ scales with $a_i$ in real materials.
  • Develop methods to measure $b$ directly (or infer it robustly) from experimental data, including uncertainty quantification and sensitivity to sample preparation, reaction conditions, and instrument modality.
  • Provide a rigorous algorithmic framework to compute assembly index $a_i$ for arbitrary objects with guarantees on optimality (shortest path) and tractable complexity, including worst-case and average-case bounds for realistic object classes (e.g., molecules, polymers, morphologies, symbols).
  • Define canonical primitives $U$ and joins $J$ for non-molecular substrates (e.g., languages, memes, morphologies, technological artifacts), and specify substrate-appropriate fragmentation operations that ensure consistent, reproducible construction of assembly spaces.
  • Clarify object equivalence and distinguishability: specify when stereoisomers, conformers, isotopologues, crystal polymorphs, or device variants constitute distinct object types for $a_i$ and $n_i$, and how measurement resolution and environmental context affect object identity.
  • Establish cross-technique validation protocols demonstrating that IR, MS, and NMR yield consistent $a_i$ for the same object, including error bars, inter-instrument variability, calibration procedures, and handling of lossy projections and fragmentation biases.
  • Quantify the measurement resolution parameter $M$ (e.g., $M \approx 10{,}000$ for MS) across instruments and contexts, including ionization efficiency, matrix effects, suppression, dynamic range, and how $M$ maps onto true copy number $n_i$ in complex mixtures.
  • Distinguish stability-driven persistence from reproduction-driven persistence: develop criteria to separate high $n_i$ due to thermodynamic/kinetic stability (e.g., mineral crystals) from high $n_i$ due to selected constructive lineages, avoiding false positives in life detection.
  • Systematically test the abiotic threshold ($a_M \sim 15$ for molar chemical samples) across diverse abiotic environments (geochemical, atmospheric, planetary analogs) and instrument platforms to assess robustness, generality, and conditions under which the threshold shifts.
  • Provide empirical case studies for borderline systems (e.g., Titan aerosols, interstellar chemistry, hydrothermal vents) to identify potential false positives/negatives in using high $a_i$ and high $n_i$ as biosignatures.
  • Integrate time explicitly: formalize the relationship between assembly time (forward steps $a$) and assembly index $a_i$ in realistic processes with parallel, cyclic, or stochastic pathways; specify how kinetics, rates, and temporal resource constraints affect thresholds and observables.
  • Assess how finite reaction volumes, spatial distribution of mass, transport, and environmental heterogeneity at planetary/cosmological scales affect the extrapolation from laboratory $N_T$ to planetary/cosmological bounds for $a_1$.
  • Develop a quantitative mapping between copy number $n_i$ and selection intensity along lineages, controlling for reaction network topology, autocatalysis, templating, and environmental memory, rather than treating $n_i$ as a generic proxy for contingency.
  • Formalize and empirically validate the concept of “virtual objects” and virtual copy number $v_i$: operationalize how to infer latent causal structure from observed objects, and test predictions (e.g., compounding via $e^{a_i}$) against data.
  • Complete and validate the “Assembly” metric $A$ (Eq. 3–4): specify units, normalization, interpretability, computational procedure from experimental data, and demonstrate utility on real datasets (e.g., distinguishing abiotic vs. biotic assemblages).
  • Clarify DAG constraints in assembly paths: reconcile the directed acyclic graph definition with real-world cycles (e.g., recycling pathways, autocatalytic loops, manufacturing rework) and define how cycles are represented or broken in assembly spaces.
  • Address the relationship between AT’s material causation and existing causal frameworks (e.g., Pearl’s interventions, do-calculus, process theories): provide mappings, conditions under which AT reduces to or diverges from these frameworks, and empirical tests adjudicating differences.
  • Situate AT within thermodynamics and statistical mechanics: quantify energy, entropy, and resource costs associated with increasing $a_i$; evaluate whether $a_i$ correlates with free-energy barriers or network complexity, and delineate when thermodynamics predicts or constrains assembly thresholds.
  • Provide a theory for microphysical causation assumptions (e.g., “no spontaneous generation of arbitrarily complex objects”) consistent with quantum fluctuations and rare-event statistics, including how AT eliminates Boltzmann brain pathologies without ad hoc fine-tuning.
  • Define and test the “selective threshold” operationally for life detection across scales (molecules → ribosomes → cells → artifacts): specify units, joins, measurement pipelines, and minimal datasets required to detect transitions in substrate (geo → bio → techno).
  • Create benchmark datasets and open toolchains for computing $a_i$, $n_i$, $A$, and threshold bounds across substrates, enabling reproducibility, community validation, and comparative studies.
  • Evaluate sensitivity of $a_i$ to choice of primitives and joins: quantify how different reasonable choices (e.g., including vs. excluding hydrogen in molecular assembly) shift $a_i$, and define canonical conventions to ensure cross-study comparability.
  • Investigate how environmental memory is stored and measured: develop metrics linking $n_i$, half-life, and environmental constraints to memory depth, and distinguish memory embodied in objects from memory distributed across environments/constructors.
  • Extend AT predictions and thresholds to macroscopic technological systems (e.g., circuits, software, machines): define $U$, $J$, fragmentation, and measurement proxies (e.g., version graphs, component libraries) and demonstrate predictive value for design, innovation, and lineage analysis.
  • Determine limits of computability of assembly spaces: characterize which substrates admit computable $a_i$, under what approximations, and how to provide bounds or probabilistic estimates when exact computation is infeasible.
  • Provide statistical power analyses for detecting high-$a_i$ objects at low abundance: specify sample sizes, instrument settings, and analysis pipelines needed to confidently assert presence/absence relative to $a_M$ in complex, noisy data.
  • Develop falsifiable predictions unique to AT (beyond existing complexity or causation measures), design experiments that could refute AT’s axioms (e.g., observing reproducible high-$a_i$ objects abiotically above predicted bounds), and articulate criteria for theoretical revision.

Practical Applications

Immediate Applications

The following applications can be deployed now using existing measurement techniques for molecular assembly index (via MS, IR, NMR) and copy number.

  • Biogenicity screening in analytical chemistry workflows (healthcare, environmental, forensics, food safety)
    • Use measured assembly index (ai) and instrument detection limits (M) to flag molecules above the abiotic threshold (e.g., aM ≈ 15 for molar-scale abiotic samples) as likely products of selection.
    • Workflow: sample prep → MS/IR/NMR fragmentation → automated assembly index computation → compare to calibrated aM for the sample size → triage/flagging → targeted identification.
    • Tools/products: “Assembly Index Analyzer” plugins for mass spectrometers; reporting dashboards integrating ai and copy number.
    • Dependencies/assumptions: correct definition of molecular primitives and composition rules; calibration of aM for system size NT and instrument resolution M; contamination controls; simple global branching factor b ≈ 25 is a first-order approximation.
  • Origin-of-life and prebiotic chemistry experiment control (academia)
    • Apply assembly index thresholds to discriminate genuine abiotic chemistry from inadvertent biological contamination in prebiotic simulations.
    • Workflow: regular assembly index audits of reaction products; exclude or analyze any species with ai > aM for the experiment’s NT and M.
    • Tools/products: standardized AT QA protocols for prebiotic labs; datasets of abiotic ai distributions under controlled conditions.
    • Dependencies/assumptions: robust instrument calibration; consistent fragmentation methods; threshold values may vary with substrate and conditions.
  • Retrosynthesis and synthesis planning heuristics (pharmaceuticals, fine chemicals, software)
    • Use assembly index as a generative description-length heuristic to identify reusable substructures and minimize the effective number of constructive joins in synthetic plans.
    • Workflow: compute ai for target; identify modular repeats; prioritize routes that exploit repeated motifs and reduce unique joins.
    • Tools/products: AT-enhanced retrosynthesis modules in cheminformatics packages; ranking metrics for route complexity.
    • Dependencies/assumptions: ai is not a direct proxy for reaction-step count; requires mapping between joins and chemically feasible steps; substrate-specific rules must be encoded.
  • Quality assurance in bioprocessing and fermentation (healthcare, industrial biotech)
    • Monitor the assembly index distribution of product streams to verify biological process health and detect off-target products or contamination.
    • Workflow: periodic MS profiling → ai distribution monitoring → anomaly detection when high-ai molecules appear/disappear unexpectedly.
    • Tools/products: inline MS with AT analytics; control charts tracking ai and copy number.
    • Dependencies/assumptions: stable baselines for process-specific ai distributions; sufficient copy numbers for detection; false positives from complex abiotic additives must be managed.
  • Environmental biosurveillance and source attribution (policy, public health)
    • Detect high-ai molecules in environmental samples to infer biological activity or industrial discharge requiring attention.
    • Workflow: field sampling → portable MS/IR → ai computation → geospatial mapping of high-ai copy-number hotspots → response prioritization.
    • Tools/products: portable “AT kits” for field labs; mapping platforms overlaying ai signatures with environmental data.
    • Dependencies/assumptions: instrument sensitivity in field conditions; abiotic complexity baselines for local environments; policy thresholds tied to validated risk models.
  • Astrobiology laboratory prototyping and mission payload pre-validation (space, academia)
    • Use AT thresholds to design and validate lab tests that emulate in situ life detection analyses for candidate payload instruments.
    • Workflow: simulate planetary sample sizes → test instrument M → verify ability to reject abiotic chemistry above aM and detect selected objects through copy number.
    • Tools/products: AT calibration protocols for flight-like instruments; benchmark datasets of abiotic and biotic mixtures.
    • Dependencies/assumptions: accurate scaling of aM with NT; planetary-relevant substrates; contamination safeguards.
  • Standardized reporting of constructed complexity in molecular databases (academia, software)
    • Annotate molecules with ai and measured copy-number metadata to enable comparative studies of constructed complexity across datasets.
    • Workflow: batch ai computation for libraries → attach ai fields to records → enable queries by complexity thresholds.
    • Tools/products: AT-enabled cheminformatics libraries; public datasets with ai labels.
    • Dependencies/assumptions: consistent primitives and rules across databases; algorithmic reproducibility of ai computation.
  • Forensic provenance analysis of chemical samples (forensics, policy)
    • Use the copy number (CN) principle and ai to infer whether complex compounds likely arose from biological/technological lineages versus spontaneous abiotic processes.
    • Workflow: measure ai and copy number of key markers → compare against abiotic thresholds → combine with contextual evidence for provenance inference.
    • Tools/products: AT-based forensic protocols; admissible reporting standards.
    • Dependencies/assumptions: validated admissibility standards; known error rates; confounders (e.g., rare abiotic catalytic environments) must be addressed.
  • Instrument design and calibration to resolve AT thresholds (instrumentation, metrology)
    • Calibrate detection limits (M) and sampling volumes (NT) to ensure practical separation of abiotic and selected phases (visible as an abrupt cutoff near aM).
    • Workflow: instrument parameter sweeps → measure abiotic ai distributions → set operational M and NT for clear threshold discrimination.
    • Tools/products: AT calibration suites; reference materials with known ai.
    • Dependencies/assumptions: access to verified abiotic standards; inter-instrument ai invariance holds across modalities (MS, IR, NMR).
  • Education and training modules on material causation (education)
    • Integrate AT lab exercises into chemistry and physics curricula to teach causation as a measurable material property.
    • Workflow: student labs measuring ai via MS or IR for known molecules; compare across modalities to demonstrate invariance.
    • Tools/products: teaching kits; open-source ai calculators; curated molecule sets spanning ai ranges.
    • Dependencies/assumptions: access to instrumentation or high-quality datasets; simplified computational tools for classroom use.
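
The triage step shared by the screening workflows above can be sketched as a simple decision rule (the function name, threshold defaults, and sample data are illustrative, not from the paper): flag an observation only when the measured assembly index clears the calibrated abiotic threshold and the copy number clears the instrument's detection floor.

```python
def flag_selected(ai: int, copies: int,
                  a_M: float = 15.0, min_copies: int = 10_000) -> bool:
    """Illustrative triage rule: a high assembly index alone is not enough;
    it must come in countable copies to evidence a persistent constructor."""
    return ai > a_M and copies >= min_copies

samples = {
    "simple abundant salt":   (4, 10**9),       # complex? no -> pass
    "one-off complex adduct": (22, 1),          # likely noise/artifact -> pass
    "complex, many copies":   (22, 5 * 10**6),  # flag for targeted follow-up
}
flagged = [name for name, (ai, n) in samples.items() if flag_selected(ai, n)]
print(flagged)  # -> ['complex, many copies']
```

Both conditions matter, echoing the summary's point that copy number and assembly index are jointly, not separately, the biosignature.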

Long-Term Applications

The following applications require further research, scaling, instrumentation development, or cross-substrate formalization before broad deployment.

  • In situ life detection on planetary missions using AT thresholds (space, policy)
    • Deploy instruments capable of measuring ai and copy number in planetary samples; classify molecules with ai above planetary/cosmological bounds (e.g., a1 ≈ 37 for Earth-scale systems, ≈ 58 cosmological) as definitive biosignatures/technosignatures.
    • Tools/products: flight-qualified AT-MS/IR payloads; onboard ai computation; mission data standards using AT criteria.
    • Dependencies/assumptions: robust measurement under extreme conditions; reliable estimation of NT in situ; contamination control; validated false-positive rates across exotic chemistries.
  • International metrology standards for assembly index (metrology, policy)
    • Establish assembly index as a standardized unit of constructed complexity with reference materials, protocols, and interlaboratory comparisons.
    • Tools/products: NMI-endorsed AT standards; certification programs; traceable reference substances with agreed ai values.
    • Dependencies/assumptions: consensus on primitives and composition rules; cross-modality invariance; sustained interlab reproducibility.
  • Cross-substrate assembly spaces (languages, software, memes, morphologies) for provenance and authenticity (software, education, cybersecurity)
    • Define units and joins for non-molecular substrates to measure ai and detect selected, lineage-dependent constructs (e.g., human-authored text vs random, genuine vs deepfake media).
    • Tools/products: AT provenance detectors for digital content; AI model audits using ai distribution signatures; plagiarism and counterfeit detection services.
    • Dependencies/assumptions: rigorous substrate-specific formalization; validated measurement pipelines; minimizing bias and adversarial gaming.
  • Manufacturability metrics and robotic assembly planning (manufacturing, robotics)
    • Use ai to quantify design complexity and derive minimal assembly sequences; integrate into digital twins to optimize assembly lines and robotic strategies.
    • Tools/products: AT planners for CAD/PLM; robot task optimizers minimizing effective joins; complexity-based cost estimators.
    • Dependencies/assumptions: mapping physical joins to formal AT joins; scalable measurement for multi-material products; industry acceptance and validation.
  • Ecosystem, biosphere, and city-level “Assemblage” metrics (policy, sustainability, urban planning)
    • Aggregate ai and copy numbers to compute A (assemblage) as a measure of the causal depth and selection embedded in a system; track innovation, resilience, and ecological health.
    • Tools/products: dashboards for A over time; policy targets linked to assemblage trajectories; comparative studies across regions.
    • Dependencies/assumptions: robust data collection across heterogeneous objects; defensible object-type definitions; causal interpretation frameworks.
  • Clinical diagnostics via complexity signatures (healthcare)
    • Exploit ai distributions in metabolomics/proteomics to detect disease states (e.g., cancer-specific complexity patterns), microbial infections, or treatment response.
    • Tools/products: AT-enabled diagnostic assays; decision-support integrating ai with conventional biomarkers.
    • Dependencies/assumptions: large-scale clinical validation; confounder control (diet, microbiome, medications); regulatory approvals.
  • Environmental regulation and planetary protection policies based on AT thresholds (policy)
    • Codify thresholds using ai and copy number to define presence of life-derived chemistry; inform cleanup standards and planetary protection protocols.
    • Tools/products: regulatory guidance on acceptable ai ranges in effluents; mission rules leveraging AT for contamination assessment.
    • Dependencies/assumptions: stakeholder consensus; clear enforcement frameworks; adaptive thresholds as science evolves.
  • Catalysis discovery and materials design via guided exploration of assembly space (materials, energy)
    • Use AT to navigate causal possibility spaces, identifying catalyst pathways that minimize effective joins and exploit reusable motifs.
    • Tools/products: AT-informed generative design platforms; catalyst screening prioritized by assembly-efficient pathways.
    • Dependencies/assumptions: high-fidelity mapping from assembly joins to reaction mechanisms; integrated computational–experimental loops.
  • Technosignature detection and SETI (space, astronomy)
    • Search for high-ai objects or spectral signatures with high copy numbers in remote sensing data as indicators of technology or life elsewhere.
    • Tools/products: AT feature extractors for astronomical spectra; survey pipelines flagging ai above cosmological bounds.
    • Dependencies/assumptions: ability to infer ai from remote, noisy data; reliable copy-number estimation at scale; astrophysical false-positive characterization.
  • IP valuation and R&D portfolio analytics based on constructed complexity (finance, corporate strategy)
    • Use ai and assemblage to quantify the causal depth embedded in technologies, informing valuation and strategic investment.
    • Tools/products: AT-based complexity indices for patents and product families; portfolio dashboards.
    • Dependencies/assumptions: demonstrated correlation with economic impact; substrate-specific formalization (software, hardware); acceptance by financial stakeholders.
  • Counterfeit detection in complex chemical products (pharmaceuticals, consumer goods)
    • Identify deviations in ai/copy-number profiles from authentic products to flag counterfeit or substandard items.
    • Tools/products: AT-based authenticity benchmarks; rapid test kits for supply chains.
    • Dependencies/assumptions: robust authentic baselines; resilience to batch variability; regulatory integration.
  • Education and public understanding of causation in physics (education, science communication)
    • Develop curricula and media that teach causation, contingency, and selection as material properties, fostering literacy in complexity science.
    • Tools/products: cross-disciplinary courses; interactive visualizations of assembly spaces; outreach programs.
    • Dependencies/assumptions: accessible measurement exemplars beyond molecules; educator training; alignment with standards.
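The assemblage metric A invoked in the ecosystem and city-level bullet can be illustrated with a minimal sketch. The functional form below, A = Σᵢ e^{aᵢ}(nᵢ − 1)/N_T, follows earlier assembly-theory literature and is an assumption here rather than a formula quoted from this paper; the key property it encodes is that single-copy objects (nᵢ = 1) contribute nothing, so A only credits structures whose copy number gives evidence of a persistent producing mechanism.

```python
import math

def assemblage(objects, n_total):
    """Ensemble assembly A = sum_i exp(a_i) * (n_i - 1) / N_T.
    This functional form follows earlier assembly-theory work (an
    assumption here). `objects` is an iterable of
    (assembly_index, copy_number) pairs; `n_total` is N_T, the total
    number of objects sampled. One-off objects (n_i = 1) contribute
    zero: a single copy carries no evidence of contingent production."""
    return sum(math.exp(a_i) * (n_i - 1) / n_total for a_i, n_i in objects)
```

For example, assemblage([(5, 10), (3, 1)], 11) is dominated entirely by the high-assembly-index, high-copy-number object, while the single-copy object contributes nothing, which is the behaviour a selection-tracking dashboard would want.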

Glossary

  • Abiotic upper bound: A theoretical maximum on what can form without biological or intelligent selection in assembly space. "a theoretical proof of an abiotic upper bound in assembly space"
  • Agential: Pertaining to agent-like forms (with goal-directed or agentive properties) derived from life. "agential, intelligent and technological forms"
  • Assembly index: The minimum number of recursive joining steps (causes) required for an object to exist. "assembly index, defined as the minimum number of steps necessary for a distinguishable object to exist"
  • Assembly space: The physical space of all possible objects and their causal joining pathways under substrate-specific rules. "assembly space, understood as the set of all causal possibilities."
  • Assembly Threshold Theorem: A result stating that a finite recursive causal threshold exists beyond which objects cannot arise without selection. "Assembly Threshold Theorem: A recursive causal threshold exists for all finite, combinatorial physical systems."
  • Assembly time: The number of discrete causal joining steps in a forward construction process for an object. "defining an assembly time a, as the number of discrete causal, constructive steps [17]."
  • Assemblage: The configuration of causal possibilities intrinsic to a collection of objects, used to quantify total selection. "We call the configuration of causal possibilities intrinsic to a collection of objects an assemblage"
  • Boltzmann brains: Hypothetical self-aware entities arising from random fluctuations, illustrating pathologies in current physics without causation. "(e.g., Boltzmann brains, and requiring fine-tuning)"
  • Branching factor: The number of new objects that can result from a single causal join at a given step, parameterizing combinatorial growth. "We assume a branching factor, b, to parameterize this growth"
  • Causal join: A substrate-allowed operation that combines two objects to produce a new one in assembly space. "where J(x, y, z) = 1 indicates the causal join can happen"
  • Chiral centres: Stereogenic atoms in a molecule that give rise to handedness and multiple stereoisomers. "and with 11 chiral centres there are 2^11 or 2048 possible stereoisomers,"
  • Constructor: A persistent mechanism (possibly widely distributed) that enables reliable production of objects along a causal chain. "a constructor, see e.g. von Neumann [20] or Deutsch [21]"
  • Contingency: Dependence on prior, specific causal histories or stored memory for an object’s existence. "Deeper memory leads to contingency and allows the production of copies of more intricate objects"
  • Copy Number (CN) Principle: The principle that countable copies of an object are direct evidence of contingency in its construction. "Copy Number (CN) Principle. The existence of countable copies of a distinguishable object type is direct evidence of contingency along the causal chain mediating the object's construction."
  • Copy number: The count of identical, distinguishable objects of a given type in a sample. "Copy number, ni, is the countable number of each distinguishable object type, i, (defined up to limits of measurement)."
  • Cosmological bound: The upper assembly index limit for spontaneous formation even at the scale of the observable universe. "the cosmological bound is a1(10^80) ~ 58"
  • Counterfactual definitions of cause: Approaches defining causation via hypothetical interventions or alternative scenarios. "This dissolves issues inherent to counterfactual definitions of cause"
  • De novo: A process occurring anew or from scratch without pre-existing templates. "zero growth of de novo complex, biological forms [1]."
  • Epistemological limit: A measurement-dependent bound determining what assembly indices are observable. "The parameter M quantifies aM as an epistemological limit"
  • Ergodic sampling: Exploration of a space where all states are sampled given sufficient time; used here for chemical space. "and an ergodic sampling of chemical space."
  • Fragmentation: The process of recursively splitting objects into parts to infer assembly space and joins. "We define fragmentation as the process of splitting an object into two parts, e.g., z -> {x, y}"
  • Ground-state: The lowest-energy quantum state of a system, containing full structural information. "the complete quantum state (or equivalently its ground-state) contains all information"
  • Half-life: The characteristic time for half the amount of an object to decay or transform. "timescales longer than its natural half-life (timescale to decay)"
  • Hyper-objects: Objects with large causal extent or depth, whose lineages span many assembly steps. "are effectively hyper-objects as their assembly space, or lineage, is large in its causal depth."
  • Infrared (IR) spectroscopy: A technique probing molecular vibrations to infer structure and, here, assembly index. "like infrared (IR) spectroscopy, nuclear magnetic resonance (NMR) spectroscopy, and mass spectrometry (MS) [48]."
  • Intensive property: A property intrinsic to an object and independent of system size or measurement method. "Assembly index is an intensive property, subject to measurement"
  • Interventional theories of causation: Prior theories defining causation via interventions, argued to be incompatible with fundamental physics. "a departure from prior theories of causation, such as interventional ones,"
  • Mass spectrometry (MS): An analytical method that fragments molecules to measure mass-to-charge patterns, used to infer assembly index. "In a chemistry lab, mass spectrometry is a common technique for identifying molecules and measuring their properties [49]."
  • Metrology: The science of measurement; here, the standardized measurement framework for causation and complexity. "constitutive of a metrology of evolution and selection."
  • Microphysical causation: Causation at the level of microphysical processes that imposes a partial ordering on existence. "if microphysical causation is a real feature of our universe"
  • Nuclear magnetic resonance (NMR) spectroscopy: A technique probing nuclear spin environments to deduce molecular structure and assembly index. "like infrared (IR) spectroscopy, nuclear magnetic resonance (NMR) spectroscopy, and mass spectrometry (MS) [48]."
  • Ontological limit: The bound on existence (as opposed to measurability), specifying the maximum assembly index for any object to exist spontaneously. "the threshold a becomes an ontological limit, a1."
  • Ontological property: A property that is intrinsic to being or existence of objects, not dependent on observers or context. "complexity as an ontological property"
  • Planetary bound: The upper assembly index limit for spontaneous formation at the scale of Earth’s material resources. "the planetary bound is a1(10^50) ~ 37"
  • Secondary metabolite: A biologically produced molecule not essential to primary metabolism, often complex (e.g., Taxol). "Taxol, a secondary metabolite with molecular formula C47H51N1O14,"
  • Selective threshold: The boundary in assembly space above which objects require selection and cannot form spontaneously. "These allow the precise definition of a selective threshold in assembly space"
  • Stereoisomers: Molecules with the same formula and connectivity but different spatial arrangements. "there are 2^11 or 2048 possible stereoisomers,"
  • Substrate-specific: Dependent on the material substrate; assembly spaces and joins vary by substrate. "Assembly spaces are substrate specific"
  • Virtual copy number: A compounded measure of causal depth reflecting collapsed branching along an object’s lineage. "the virtual copy number captures how the number of objects observed encodes a lineage of selected causation"
  • Virtual objects: Inferred structures within assembly space that lack autonomous existence but are implied by observed objects and fragmentation. "what we call 'virtual objects':"
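The glossary definitions of assembly index, causal join, and substrate-specific assembly spaces can be made concrete in a toy string substrate, where the only allowed join is concatenation and single characters come for free. The exhaustive search below is a hypothetical illustration of the definition (exponential time, tiny inputs only), not the paper's algorithm; it uses the fact that in this substrate every object on a minimal pathway must be a contiguous fragment of the target.

```python
def assembly_index(target: str, max_depth: int = 10) -> int:
    """Minimum number of concatenation joins needed to build `target`,
    reusing any previously built fragment (iterative-deepening search).
    Toy string substrate only: exponential time, suitable for tiny inputs."""
    units = frozenset(target)  # single characters are free building blocks
    # Only contiguous fragments of the target can appear on a minimal pathway.
    subs = {target[i:j] for i in range(len(target))
            for j in range(i + 1, len(target) + 1)}

    def dfs(built: frozenset, depth: int) -> bool:
        if target in built or target in units:
            return True
        if depth == 0:
            return False
        pool = built | units
        for x in pool:
            for y in pool:
                z = x + y  # a causal join: combine two existing objects
                if z in subs and z not in built and dfs(built | {z}, depth - 1):
                    return True
        return False

    for depth in range(max_depth + 1):
        if dfs(frozenset(), depth):
            return depth
    raise ValueError("max_depth exceeded")

# "abab" has index 2 (join a+b -> "ab", then "ab"+"ab" -> "abab"),
# while "abcd", with no reusable fragment, needs 3 joins.
```

The reuse of "ab" in building "abab" is the essence of recursive assembly: copies of previously constructed objects lower the cost of deeper objects, which is why high assembly index plus high copy number signals a contingent, selected lineage rather than chance.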
