
Topological Message Passing Systems

Updated 26 January 2026
  • Topological message passing systems are advanced deep learning architectures that operate on combinatorial domains using incidence relations to capture higher-order interactions.
  • They generalize traditional graph neural networks by incorporating simplicial, cellular, hypergraph, and combinatorial structures with boundary and coboundary operators.
  • These systems offer enhanced expressive power and scalability for practical applications in bioinformatics, molecular modeling, and computer-aided design.

A topological message passing system is a general architectural paradigm in topological deep learning (TDL) that extends classical graph neural network (GNN) message passing to operate on richer combinatorial domains, including simplicial complexes, cellular complexes, hypergraphs, and combinatorial complexes. These systems are uniquely capable of modeling higher-order relations, respecting the combinatorial and algebraic topology of the underlying domain, and are defined operationally by propagating and aggregating signals across the incidence structure of topological cells (vertices, edges, faces, etc.) in a way determined by boundary and coboundary relations. This approach enables neural architectures to capture interactions such as loops, cavities, and group connectivity patterns that cannot be encoded by pairwise graph edges alone (Papillon et al., 2023).

1. Foundations and General Mathematical Framework

A topological message passing system operates on a combinatorial structure with cells of varying dimension, such as vertices (0-cells), edges (1-cells), faces (2-cells), etc. These domains are formalized through discrete topological objects—simplicial complexes, cellular complexes (CW-complexes), hypergraphs, or combinatorial complexes. A typical layer \ell of a topological neural network (TNN) acts on an r-cochain space:

C_r = \{ f : X^{(r)} \to \mathbb{R}^{d_r} \}

where X^{(r)} is the set of r-cells and d_r the feature dimension of r-cells. Features are updated by aggregating messages passed along cell incidence relations defined by boundary (\partial_r), coboundary (\delta_r), lower adjacency (L_{\downarrow, r}), and upper adjacency (L_{\uparrow, r}) operators, each of which encodes a different combinatorial neighborhood of an r-cell (Papillon et al., 2023).
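As a concrete illustration, the boundary and coboundary operators can be realized as signed incidence matrices. The following sketch builds them for a toy filled-triangle complex (all names and the particular orientation convention are illustrative) and checks the defining chain-complex identity that the boundary of a boundary vanishes:

```python
import numpy as np

# Toy complex: vertices {0, 1, 2}, edges {(0,1), (0,2), (1,2)}, one face (0,1,2).
# B1 is the vertex-edge incidence matrix (rows: vertices, cols: edges);
# B1.T realizes the coboundary. Signs follow the chosen orientations.
B1 = np.array([
    [-1, -1,  0],
    [ 1,  0, -1],
    [ 0,  1,  1],
])
# B2 is the edge-face incidence matrix (rows: edges, cols: faces).
B2 = np.array([
    [ 1],
    [-1],
    [ 1],
])

# Chain-complex identity: the boundary of a boundary is zero.
assert np.all(B1 @ B2 == 0)

# Lower and upper adjacency operators for edges (1-cells), as in the text:
L_down = B1.T @ B1   # edges related through a shared vertex
L_up   = B2 @ B2.T   # edges related through a shared face
```

The same construction extends to any simplicial or cellular complex; the four operators of the text are all products of these incidence matrices.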

Formal Update Formula

For each r-cell x, its feature at layer \ell+1 is updated by a four-step process:

  1. Message computation: for each neighbor y \in N_k(x) with k \in \{\partial, \delta, \downarrow, \uparrow\},

m_{y \to x}^{(\ell, k)} = M_k^{(\ell)}(h_x^{(\ell)}, h_y^{(\ell)}, \Theta_k^{(\ell)})

  2. Within-neighborhood aggregation:

m_x^{(\ell, k)} = \mathrm{AGG}_{y \in N_k(x)}\, m_{y \to x}^{(\ell, k)}

  3. Between-neighborhood aggregation:

m_x^{(\ell)} = \mathrm{AGG}_{k \in K_r}\, m_x^{(\ell, k)} \quad \text{with}\; K_r \subset \{\partial, \delta, \downarrow, \uparrow\}

  4. Feature update:

h_x^{(\ell+1)} = U^{(\ell)}(h_x^{(\ell)}, m_x^{(\ell)}, \Phi^{(\ell)})

Typical choices for M_k^{(\ell)} include linear maps, MLPs, or attention mechanisms; for \mathrm{AGG}, permutation-invariant operations (sum, mean, max) or attention-weighted aggregation; and for U^{(\ell)}, nonlinearities or gated updates. This framework generalizes graph message passing to higher dimensions (Papillon et al., 2023).
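The four steps above can be sketched in code. This is a minimal illustration with linear message functions, sum aggregation, and a tanh update; the dictionary-based cell representation, weight names, and toy neighborhood are hypothetical choices, not a reference implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def message_passing_step(h, neighborhoods, W, U):
    """One layer of the four-step scheme (sketch).

    h:             dict mapping each cell to its feature vector
    neighborhoods: dict mapping a neighborhood type k (boundary, coboundary,
                   down, up) to {cell: list of neighbors N_k(cell)}
    W:             dict mapping k to a linear message map M_k
    U:             update weight matrix applied to [h_x; m_x]
    """
    h_next = {}
    for x, hx in h.items():
        per_type = []
        for k, N_k in neighborhoods.items():
            # Step 1: compute a message from each neighbor y in N_k(x).
            msgs = [W[k] @ h[y] for y in N_k.get(x, [])]
            # Step 2: within-neighborhood aggregation (sum).
            if msgs:
                per_type.append(np.sum(msgs, axis=0))
        # Step 3: between-neighborhood aggregation (sum over k).
        m = np.sum(per_type, axis=0) if per_type else np.zeros_like(hx)
        # Step 4: feature update with a nonlinearity.
        h_next[x] = np.tanh(U @ np.concatenate([hx, m]))
    return h_next

# Toy usage: three edges (1-cells), upper-adjacent through a shared face.
d = 4
h = {e: rng.standard_normal(d) for e in ["e01", "e02", "e12"]}
nbhd = {"up": {"e01": ["e02", "e12"], "e02": ["e01", "e12"], "e12": ["e01", "e02"]}}
W = {"up": rng.standard_normal((d, d))}
U = rng.standard_normal((d, 2 * d))
h1 = message_passing_step(h, nbhd, W, U)
```

Swapping the sums for means, maxima, or attention weights, and the linear maps for MLPs, recovers the other choices named above.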

2. Varieties of Topological Message Passing Architectures

A broad spectrum of architectures arises from selecting different combinatorial domains, neighborhoods, and aggregation mechanisms:

The main families, their neighborhoods, aggregations, and per-layer costs can be summarized as follows:

  • Hypergraphs (HyperSAGE, AllSet, HNHN, UniGNN, EHNN): boundary and coboundary neighborhoods; sum/mean/attention aggregation; per-layer cost O(|V| + |E|).
  • Simplicial complexes (HodgeNet, SNN, SCCONV, MPSN, SCoNe, SCNN, HSN): all four neighborhood types; sum/attention aggregation; per-layer cost scaling with the number of cells in each dimension.
  • Cellular complexes (CWN, CAN): boundary and coboundary neighborhoods; sum/attention aggregation; per-layer cost scaling with cell counts across dimensions.
  • Combinatorial complexes (HOAN): all neighborhoods, including adjacencies; multi-head attention; per-layer cost scaling with cell counts across dimensions.
  • Hypergraph NNs typically alternate node and hyperedge message aggregation using incidence structure.
  • Simplicial complex NNs (e.g., MPSN) use boundary, coboundary, and (upper/lower)-adjacency for k-cell messaging; spectral (Hodge Laplacian) and spatial aggregations are both supported (Papillon et al., 2023).
  • Cellular complex NNs extend message passing to arbitrary cell types, while combinatorial complex NNs offer maximal flexibility through multi-neighborhood attention.
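The alternating node/hyperedge aggregation used by hypergraph NNs can be sketched as follows (a minimal mean-aggregation variant; the weight matrices and layer shape are illustrative assumptions, not any specific published architecture):

```python
import numpy as np

def hypergraph_layer(X, H, W_e, W_v):
    """One alternating node -> hyperedge -> node pass (sketch).

    X:   (n_nodes, d) node features
    H:   (n_nodes, n_edges) binary incidence matrix
    W_e: (d, d) hyperedge transform, W_v: (d, d) node transform
    """
    deg_e = H.sum(axis=0, keepdims=True)     # nodes per hyperedge
    E = (X.T @ H / deg_e).T @ W_e            # mean of member nodes per hyperedge
    deg_v = H.sum(axis=1, keepdims=True)     # hyperedges per node
    X_new = (H @ E) / deg_v @ W_v            # mean of incident hyperedges per node
    return np.maximum(X_new, 0.0)            # ReLU
```

With a sparse incidence matrix, both stages cost one pass over the nonzeros, which is the source of the O(|V| + |E|) per-layer scaling noted in the table.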

3. Theoretical Properties and Expressivity

Topological message passing systems can achieve strictly greater expressive power than graph-based GNNs bounded by the 1-Weisfeiler–Lehman (1-WL) test. For instance:

  • Simplicial and cellular message passing networks typically surpass 1-WL expressive power; some hypergraph architectures (e.g., AllSet, EHNN) are universal approximators of multiset functions.
  • Spectral (Hodge-based) TNNs using polynomial Chebyshev filters can efficiently handle higher-dimensional adjacency with polynomial complexity (Papillon et al., 2023).
  • Residual and high-skip connections (e.g., in HSN, UniGCNII) mitigate oversmoothing, a phenomenon where deep GNNs collapse features (Papillon et al., 2023).
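A spectral Hodge-based filter of the kind mentioned above can be sketched as a Chebyshev recurrence: each term costs only one matrix-vector product, so in practice only the largest eigenvalue of the Hodge Laplacian needs estimating (this toy sketch computes it directly; the coefficient vector `theta` stands in for learned parameters, and at least two coefficients are assumed):

```python
import numpy as np

def chebyshev_hodge_filter(L, x, theta):
    """Apply a Chebyshev polynomial filter of a Hodge Laplacian L to a
    cochain signal x (sketch; `theta` are hypothetical learned coefficients)."""
    # Rescale the spectrum into [-1, 1], the Chebyshev domain.
    lam_max = np.linalg.eigvalsh(L).max()
    L_hat = 2.0 * L / lam_max - np.eye(L.shape[0])
    # Three-term Chebyshev recurrence: T_0(L)x = x, T_1(L)x = L_hat x.
    T_prev, T_curr = x, L_hat @ x
    out = theta[0] * T_prev + theta[1] * T_curr
    for k in range(2, len(theta)):
        T_prev, T_curr = T_curr, 2.0 * L_hat @ T_curr - T_prev
        out = out + theta[k] * T_curr
    return out
```

For edge signals, L would be the Hodge 1-Laplacian built from the incidence matrices, L_1 = B_1^T B_1 + B_2 B_2^T, combining lower and upper adjacency in one operator.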

4. Computational Challenges and Optimization

Key trade-offs arise in the design of topological message passing architectures:

  • Scalability: Hypergraph NNs achieve O(|V| + |E|) per-layer complexity, while the cost of simplicial and cellular NNs scales with higher-dimensional cell counts. Attention and higher-order aggregation add further computational overhead.
  • Stability: Spectral TNNs inherit perturbation robustness from Hodge theory; spatial schemes handle dynamic complex rewiring more effectively.
  • Over-Smoothing: Deep topological message passing networks are susceptible to feature collapse; architectural strategies such as skip connections are effective countermeasures (Papillon et al., 2023).
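The skip-connection countermeasure can be as simple as mixing the updated features back with the layer input (a sketch; `alpha` is a hypothetical mixing coefficient in the spirit of UniGCNII-style residuals):

```python
import numpy as np

def residual_update(h, message, W, alpha=0.5):
    """Skip-connected feature update (sketch): a convex mix of the
    transformed aggregate message and the unchanged input features,
    so repeated layers cannot fully collapse the input signal."""
    return (1 - alpha) * np.tanh(W @ message) + alpha * h
```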

5. Unified Axiomatic and Relational Views

A unifying framework for topological message passing is given by modeling the underlying combinatorial structure as a relational structure:

  • A relational structure (\mathcal{S}, R_1, \ldots, R_k) consists of entities (cells) and designated relations (boundary, coboundary, adjacencies).
  • Message passing is then an instantiation of a general relational update:

\mathbf{h}_\sigma^{(t+1)} = \boldsymbol{\phi}^{(t)}\left(\mathbf{m}^{(t)}_{\sigma,1}, \ldots, \mathbf{m}^{(t)}_{\sigma,k}\right)

where each \mathbf{m}^{(t)}_{\sigma,i} aggregates over relation R_i using a learned function, recovering GNNs as well as simplicial, cellular, and higher-order NNs as special cases.

  • This axiomatic approach enables extension of graph-theoretic sensitivity and expressivity results to topological message passing, including quantifiable oversquashing effects and rewiring strategies via influence matrices constructed from the relational graph (Taha et al., 6 Jun 2025).
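The relational update can be sketched generically (function and argument names are illustrative): an ordinary GNN is recovered with a single edge-adjacency relation, while topological networks supply one relation per neighborhood type over cells of every dimension.

```python
import numpy as np

def relational_update(h, relations, psi, phi):
    """One step of the general relational update (sketch).

    h:         dict mapping each entity sigma to its feature vector
    relations: list of dicts, one per relation R_i: {sigma: related entities}
    psi:       list of per-relation message functions psi[i](h_sigma, h_tau)
    phi:       combiner of the entity feature and per-relation aggregates
    """
    h_next = {}
    for sigma, h_sigma in h.items():
        # Aggregate messages separately over each relation R_i.
        per_relation = [
            np.sum([psi[i](h_sigma, h[tau]) for tau in R.get(sigma, [])], axis=0)
            if R.get(sigma) else np.zeros_like(h_sigma)
            for i, R in enumerate(relations)
        ]
        h_next[sigma] = phi(h_sigma, per_relation)
    return h_next
```

Because every neighborhood type is just another relation, sensitivity and oversquashing analyses phrased on the relational graph apply uniformly to all the architectures of Section 2.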

6. Applications and Future Directions

Topological message passing systems have found application across domains where higher-order or non-pairwise interactions are present, including:

  • Bioinformatics: Protein structure and function prediction leveraging higher-order groupings of amino acids.
  • Molecule and materials science: Modeling of multi-atom bonds, supramolecular assemblies, gene regulation, and materials with complex unit cells.
  • Computer-aided design: Mesh and solid model segmentation based on B-rep or cellular structures.
  • Social and information networks: Dynamic analysis of group and subgroup connectivity (Papillon et al., 2023).

Open problems include:

  • Establishing curated benchmarking datasets for higher-order domains.
  • Theoretical development of combinatorial complex frameworks and their empirical evaluation relative to classical topology (simplicial/cellular complexes).
  • Unifying notation, advancing scalability, and developing advanced regularization and self-supervised schemes adapted from GNNs.
  • Understanding and mitigating over-smoothing and oversquashing in very deep topological architectures (Papillon et al., 2023, Taha et al., 6 Jun 2025).

7. Significance and Outlook

Topological message passing systems generalize the neural message passing paradigm from graphs to higher-order topological structures, enabling rigorous modeling of multi-way relationships with strong theoretical guarantees. These systems bring algebraic-topological priors into deep learning pipelines, which is essential for domains where phenomena such as cycles, holes, or group-wise connectivity are fundamental. Continued research is focusing on notation unification, optimization, and bridging with mainstream graph learning, to fully harness the representational capacity of topological approaches (Papillon et al., 2023).
