Topological Message Passing Systems
- Topological message passing systems are advanced deep learning architectures that operate on combinatorial domains using incidence relations to capture higher-order interactions.
- They generalize traditional graph neural networks by incorporating simplicial, cellular, hypergraph, and combinatorial structures with boundary and coboundary operators.
- These systems offer enhanced expressive power and scalability for practical applications in bioinformatics, molecular modeling, and computer-aided design.
A topological message passing system is a general architectural paradigm in topological deep learning (TDL) that extends classical graph neural network (GNN) message passing to operate on richer combinatorial domains, including simplicial complexes, cellular complexes, hypergraphs, and combinatorial complexes. These systems model higher-order relations while respecting the combinatorial and algebraic topology of the underlying domain. Operationally, they are defined by propagating and aggregating signals across the incidence structure of topological cells (vertices, edges, faces, etc.), with the flow of messages determined by boundary and coboundary relations. This approach enables neural architectures to capture interactions such as loops, cavities, and group connectivity patterns that cannot be encoded by pairwise graph edges alone (Papillon et al., 2023).
1. Foundations and General Mathematical Framework
A topological message passing system operates on a combinatorial structure with cells of varying dimension, such as vertices (0-cells), edges (1-cells), and faces (2-cells). These domains are formalized through discrete topological objects—simplicial complexes, cellular complexes (CW-complexes), hypergraphs, or combinatorial complexes. A typical layer of a topological neural network (TNN) acts on an $r$-cochain space
$$\mathcal{C}^r \cong \mathbb{R}^{|X^r| \times F_r},$$
where $X^r$ is the set of $r$-cells and $F_r$ the feature dimension of $r$-cells. Features are updated by aggregating messages passed along cell incidence relations defined by boundary ($\mathcal{N}_{\mathcal{B}}$), coboundary ($\mathcal{N}_{\mathcal{C}}$), lower-adjacency ($\mathcal{N}_{\mathcal{A}_\downarrow}$), and upper-adjacency ($\mathcal{N}_{\mathcal{A}_\uparrow}$) operators, each of which encodes a different combinatorial neighborhood of an $r$-cell (Papillon et al., 2023).
Formal Update Formula
For each $r$-cell $x$, its feature $h_x^{(l)}$ at layer $l$ is updated by a four-step process:
- Message computation: for each neighbor $y \in \mathcal{N}(x)$ with $\mathcal{N} \in \{\mathcal{N}_{\mathcal{B}}, \mathcal{N}_{\mathcal{C}}, \mathcal{N}_{\mathcal{A}_\downarrow}, \mathcal{N}_{\mathcal{A}_\uparrow}\}$, compute $m_{y \to x}^{\mathcal{N}} = M_{\mathcal{N}}\big(h_x^{(l)}, h_y^{(l)}\big)$.
- Within-neighborhood aggregation: $m_x^{\mathcal{N}} = \mathrm{AGG}_{y \in \mathcal{N}(x)}\, m_{y \to x}^{\mathcal{N}}$.
- Between-neighborhood aggregation: $m_x = \mathrm{AGG}_{\mathcal{N}}\, m_x^{\mathcal{N}}$.
- Feature update: $h_x^{(l+1)} = U\big(h_x^{(l)}, m_x\big)$.
Typical choices for $M_{\mathcal{N}}$ include linear maps, MLPs, or attention mechanisms; for $\mathrm{AGG}$, permutation-invariant operations (sum, mean, max) or attention-weighted aggregation; and for $U$, nonlinearities or gated updates. This framework generalizes graph message passing to higher dimensions (Papillon et al., 2023).
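The four steps above can be sketched in a few lines of numpy. This is a minimal illustration, not any particular published architecture: messages are linear maps ($M_{\mathcal{N}}$ as a weight matrix per neighborhood), both aggregations are sums, and the update $U$ is a residual ReLU. Each neighborhood is represented by an unsigned incidence matrix; all names here (`tmp_layer`, the incidence triples) are hypothetical.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tmp_layer(h, neighborhoods, weights):
    """One generic topological message-passing layer (sketch).

    h:             dict rank r -> (n_cells_r, F) feature array
    neighborhoods: list of (r_src, r_dst, A) triples, where A is the
                   (n_dst, n_src) 0/1 incidence matrix of one neighborhood
                   (boundary, coboundary, lower/upper adjacency, ...)
    weights:       list of (F, F) message maps M_N, one per neighborhood
    """
    # Steps 1-3: per-neighborhood messages, summed within each neighborhood
    # (A @ ... sums over neighbors) and then across neighborhoods (+=).
    agg = {r: np.zeros_like(feats) for r, feats in h.items()}
    for (r_src, r_dst, A), W in zip(neighborhoods, weights):
        agg[r_dst] += A @ (h[r_src] @ W)
    # Step 4: residual update with a nonlinearity (the role of U).
    return {r: relu(h[r] + agg[r]) for r in h}

# Usage: a triangle as a complex with 3 vertices (rank 0) and 3 edges (rank 1);
# edges message vertices via the boundary relation, vertices message edges
# via the coboundary (the transpose incidence).
h = {0: np.ones((3, 2)), 1: np.ones((3, 2))}
B1 = np.array([[1., 0., 1.], [1., 1., 0.], [0., 1., 1.]])  # vertex-edge incidence
out = tmp_layer(h, [(1, 0, B1), (0, 1, B1.T)], [np.eye(2), np.eye(2)])
```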
2. Varieties of Topological Message Passing Architectures
A broad spectrum of architectures arises from selecting different combinatorial domains, neighborhoods, and aggregation mechanisms:
| Domain | Example Architectures | Neighborhoods | Aggregation | Complexity per Layer |
|---|---|---|---|---|
| Hypergraphs | HyperSAGE, AllSet, HNHN, UniGNN, EHNN | Boundary, coboundary | Sum/mean/attention | O(\|V\| + \|E\|) |
| Simplicial | HodgeNet, SNN, SCCONV, MPSN, SCoNe, SCNN, HSN | All four | Sum/attention | O(∑_r \|X^r\|) |
| Cellular | CWN, CAN | Boundary, coboundary | Sum/attention | O(∑_r \|X^r\|) |
| Combinatorial | HOAN | All, incl. adjacencies | Multi-head attn. | O(∑_r \|X^r\|) |
- Hypergraph NNs typically alternate node- and hyperedge-level message aggregation using the incidence structure.
- Simplicial complex NNs (e.g., MPSN) use boundary, coboundary, and (upper/lower)-adjacency for k-cell messaging; spectral (Hodge Laplacian) and spatial aggregations are both supported (Papillon et al., 2023).
- Cellular complex NNs extend message passing to arbitrary cell types, while combinatorial complex NNs offer maximal flexibility through multi-neighborhood attention.
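The alternating node/hyperedge scheme used by hypergraph NNs can be written compactly with the incidence matrix: pool node features into each hyperedge, then pool hyperedge features back to each node. The sketch below uses mean pooling and a single shared weight matrix; it is a minimal illustration in the spirit of the two-stage schemes above, not the exact HyperSAGE or AllSet update, and `hypergraph_layer` is a hypothetical name.

```python
import numpy as np

def hypergraph_layer(X, H, W):
    """Two-stage hypergraph message passing (mean-pooling sketch).

    X: (n_nodes, F) node features
    H: (n_nodes, n_edges) 0/1 incidence matrix
    W: (F, F) learned map (single shared weight, for brevity)
    """
    deg_e = H.sum(axis=0, keepdims=True).T      # (n_edges, 1): nodes per hyperedge
    E = (H.T @ X) / deg_e                        # stage 1: nodes -> hyperedges
    deg_v = H.sum(axis=1, keepdims=True)         # (n_nodes, 1): hyperedges per node
    return np.maximum(((H @ E) / deg_v) @ W, 0)  # stage 2: hyperedges -> nodes

# Usage: 3 nodes, 2 hyperedges {v0,v1} and {v1,v2}.
H = np.array([[1., 0.], [1., 1.], [0., 1.]])
X = np.array([[1.], [2.], [3.]])
out = hypergraph_layer(X, H, np.eye(1))
```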
3. Theoretical Properties and Expressivity
Topological message passing systems can achieve strictly greater expressive power than graph-based GNNs bounded by the 1-Weisfeiler–Lehman (1-WL) test. For instance:
- Simplicial and cellular message passing networks typically surpass 1-WL expressive power, and some architectures (e.g., AllSet, EHNN) are universal approximators of multiset functions.
- Spectral (Hodge-based) TNNs using polynomial Chebyshev filters can efficiently handle higher-dimensional adjacency with polynomial complexity (Papillon et al., 2023).
- Residual and high-skip connections (e.g., in HSN, UniGCNII) mitigate oversmoothing, a phenomenon where deep GNNs collapse features (Papillon et al., 2023).
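The spectral operators behind Hodge-based TNNs are built directly from signed boundary matrices: the $k$-th Hodge Laplacian is $L_k = B_k^\top B_k + B_{k+1} B_{k+1}^\top$, and a Chebyshev filter applies a polynomial in $L_k$ to a $k$-cochain without any eigendecomposition. A minimal sketch for a filled triangle (in practice $L$ is rescaled so its spectrum lies in $[-1, 1]$ before Chebyshev filtering; that step is omitted here, and `chebyshev_filter` is a hypothetical helper name):

```python
import numpy as np

# Oriented boundary matrices for a filled triangle:
# vertices v0,v1,v2; edges e0=(v0,v1), e1=(v1,v2), e2=(v0,v2); one 2-cell.
B1 = np.array([[-1.,  0., -1.],
               [ 1., -1.,  0.],
               [ 0.,  1.,  1.]])   # maps edge space -> vertex space
B2 = np.array([[ 1.],
               [ 1.],
               [-1.]])             # maps the face -> edge space

# 1st Hodge Laplacian: L_1 = B_1^T B_1 + B_2 B_2^T, acting on edge signals.
L1 = B1.T @ B1 + B2 @ B2.T

def chebyshev_filter(L, x, coeffs):
    """Apply sum_k coeffs[k] * T_k(L) x via the Chebyshev recurrence
    T_0 x = x, T_1 x = L x, T_k = 2 L T_{k-1} - T_{k-2}."""
    Ts = [x, L @ x]
    while len(Ts) < len(coeffs):
        Ts.append(2 * (L @ Ts[-1]) - Ts[-2])
    return sum(c * T for c, T in zip(coeffs, Ts))
```

The kernel of the Hodge Laplacian counts topological holes: without the face term, `B1.T @ B1` has a one-dimensional kernel spanned by the triangle's 1-cycle, which the filled 2-cell removes.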
4. Computational Challenges and Optimization
Key trade-offs arise in the design of topological message passing architectures:
- Scalability: Hypergraph NNs achieve O(|V|+|E|) per-layer complexity, while the costs of simplicial and cellular NNs scale with higher-dimensional cell counts. Attention and higher-order aggregation further increase computational overhead.
- Stability: Spectral TNNs inherit perturbation robustness from Hodge theory; spatial schemes handle dynamic complex rewiring more effectively.
- Over-Smoothing: Deep topological message passing networks are susceptible to feature collapse; architectural strategies such as skip connections are effective countermeasures (Papillon et al., 2023).
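Oversmoothing and its skip-connection countermeasure are easy to demonstrate numerically. Repeated mean aggregation drives all cell features toward a common value, while an initial-residual update (in the spirit of UniGCNII-style skips; the mixing weight `alpha = 0.5` is an arbitrary illustrative choice) retains a non-constant fixed point:

```python
import numpy as np

A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])          # adjacency of a triangle
P = A / A.sum(axis=1, keepdims=True)  # mean-aggregation operator

h0 = np.array([[1.0], [0.0], [-1.0]]) # initial features, deliberately non-constant
h_plain, h_skip = h0.copy(), h0.copy()
alpha = 0.5
for _ in range(50):
    h_plain = P @ h_plain                            # plain smoothing: collapses
    h_skip = (1 - alpha) * (P @ h_skip) + alpha * h0 # initial residual: survives
```

After 50 layers the plain iterates are numerically indistinguishable (feature collapse), while the residual variant converges to a rescaled copy of the input signal.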
5. Unified Axiomatic and Relational Views
A unifying framework for topological message passing is given by modeling the underlying combinatorial structure as a relational structure:
- A relational structure consists of entities (cells) and designated relations (boundary, coboundary, adjacencies).
- Message passing is then an instantiation of a general relational update:
$$h_x^{(l+1)} = U\Big(h_x^{(l)},\ \mathrm{AGG}_{R \in \mathcal{R}}\ \mathrm{AGG}_{y\,:\,(x,y) \in R}\ M_R\big(h_x^{(l)}, h_y^{(l)}\big)\Big),$$
where each inner $\mathrm{AGG}$ aggregates over relation $R$ using a learned message function $M_R$, allowing GNNs and simplicial, cellular, and higher-order NNs to be recovered as special cases.
- This axiomatic approach enables extension of graph-theoretic sensitivity and expressivity results to topological message passing, including quantifiable oversquashing effects and rewiring strategies via influence matrices constructed from the relational graph (Taha et al., 6 Jun 2025).
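One concrete payoff of the relational view is a sensitivity bound of the familiar Jacobian-power form: after $L$ layers, the influence of entity $y$ on entity $x$ is controlled by the $(x, y)$ entry of the $L$-th power of the combined relation matrix (constants omitted; the setup below uses a single relation on a path graph purely for illustration):

```python
import numpy as np

# One relation on a 4-entity path graph 0-1-2-3 (combined relation matrix).
A_rel = np.array([[0., 1., 0., 0.],
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [0., 0., 1., 0.]])

L_layers = 3
influence = np.linalg.matrix_power(A_rel, L_layers)
# A zero entry means no influence is possible within L_layers rounds of
# message passing: entity 0 cannot affect entity 3 in fewer than 3 hops.
```

Vanishingly small (rather than exactly zero) entries in such influence matrices are exactly the quantitative signature of oversquashing, and rewiring strategies add relations to boost them.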
6. Applications and Future Directions
Topological message passing systems have found application across domains where higher-order or non-pairwise interactions are present, including:
- Bioinformatics: Protein structure and function prediction leveraging higher-order groupings of amino acids.
- Molecule and materials science: Modeling of multi-atom bonds, supramolecular assemblies, gene regulation, and materials with complex unit cells.
- Computer-aided design: Mesh and solid model segmentation based on B-rep or cellular structures.
- Social and information networks: Dynamic analysis of group and subgroup connectivity (Papillon et al., 2023).
Open problems include:
- Establishing curated benchmarking datasets for higher-order domains.
- Theoretical development of combinatorial complex frameworks and their empirical evaluation relative to classical topology (simplicial/cellular complexes).
- Unifying notation, advancing scalability, and developing advanced regularization and self-supervised schemes adapted from GNNs.
- Understanding and mitigating over-smoothing and oversquashing in very deep topological architectures (Papillon et al., 2023, Taha et al., 6 Jun 2025).
7. Significance and Outlook
Topological message passing systems generalize the neural message passing paradigm from graphs to higher-order topological structures, enabling rigorous modeling of multi-way relationships with strong theoretical guarantees. These systems bring algebraic-topological priors into deep learning pipelines, which is essential for domains where phenomena such as cycles, holes, or group-wise connectivity are fundamental. Continued research is focusing on notation unification, optimization, and bridging with mainstream graph learning, to fully harness the representational capacity of topological approaches (Papillon et al., 2023).