GBNL: Graded Betti Number Learning of Complex Biological Data
Abstract: While persistent homology is widely used for data shape analysis, persistent commutative algebra (PCA) has seen limited adoption in machine learning and data science. Unlike persistent homology, which delivers topological invariants in the form of Betti numbers, PCA provides both algebraic invariants and graded Betti numbers. However, graded Betti numbers have seldom been applied to real-world data. In this work, we introduce the first application of graded Betti numbers from commutative algebra in machine learning and data science. Specifically, we present Graded Betti Number Learning (GBNL) for protein-nucleic acid binding prediction. Protein-DNA/RNA interactions are fundamental to cellular processes such as replication, transcription, translation, and gene regulation, yet understanding and predicting them remains challenging. GBNL represents each nucleic acid sequence as a family of $k$-mer-specific sets and derives persistent graded Betti invariants from PCA, generating multiscale topological representations of local nucleotide organization. To incorporate cross-molecule context, these graded Betti representations are paired with transformer-based protein embeddings, linking nucleotide-level signals with global protein patterns. The proposed graded Betti representations effectively detect single-site mutations and distinguish complete mutation patterns. Operating on primary sequences with minimal preprocessing, GBNL bridges commutative algebra, reduced algebraic topology, combinatorics, and machine learning, establishing a new paradigm for comparative sequence analysis. Numerical studies using three datasets highlight the success of GBNL in protein-nucleic acid binding prediction.
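The abstract does not spell out how the $k$-mer-specific sets are constructed. One natural reading, sketched below as an assumption rather than the authors' exact construction, maps each $k$-mer to the set of positions at which it occurs in the sequence; under this reading, a single-site mutation perturbs only the position sets of the $k$-mers overlapping the mutated site, which is consistent with the claimed sensitivity to single-site mutations.

```python
from collections import defaultdict

def kmer_position_sets(seq, k):
    """Illustrative sketch (an assumption, not the paper's definition):
    map each k-mer to the set of start positions where it occurs,
    giving a family of k-mer-specific sets for one nucleic acid sequence."""
    sets = defaultdict(set)
    for i in range(len(seq) - k + 1):
        sets[seq[i:i + k]].add(i)
    return dict(sets)

# A point mutation changes only the sets of the k-mers covering that site.
wild = kmer_position_sets("ACGTACGA", 2)
# "AC" occurs at starts 0 and 4; "CG" at 1 and 5; "GT", "TA", "GA" once each.
```

Downstream, each such family of sets would feed the PCA machinery that produces the persistent graded Betti invariants; that step is specific to the paper and is not reproduced here.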