Hierarchical Graph ODE (HiGO)
- HiGO is a hierarchical framework that integrates multi-level graph structures with Neural ODEs to capture continuous-time spatiotemporal dynamics.
- It employs adaptive, context-sensitive message passing to effectively fuse local details with global context for robust wildfire forecasting.
- Empirical evaluations reveal state-of-the-art performance, achieving higher Macro-F1 and AUPRC scores compared to conventional methods.
The Hierarchical Graph ODE (HiGO) is a machine learning framework for modeling multi-scale, continuous-time spatiotemporal dynamics, specifically designed for applications such as global wildfire activity prediction. HiGO integrates a multi-level graph hierarchy with context-sensitive, adaptive message-passing mechanisms and Neural ODE modules parameterized by graph neural networks (GNNs), enabling effective feature extraction and information fusion across spatial scales. This approach allows HiGO to represent the Earth system as a series of interconnected graph representations, each capturing progressively coarser contextual information, and to model the inherent continuous-time evolution of wildfire activity. Empirical evaluation on the SeasFire Cube dataset demonstrates that HiGO achieves state-of-the-art results, significantly outperforming point-wise, vision-based, and conventional graph-based baselines in long-range wildfire forecasting and continuous-time interpolation (Xu et al., 4 Jan 2026).
1. Multi-Level Graph Hierarchy
HiGO represents the Earth system as a pyramid of graphs $\{G^{(\ell)} = (V^{(\ell)}, E^{(\ell)})\}_{\ell=1}^{L}$:
- Level 1 ($G^{(1)}$): A regular latitude–longitude grid, with nodes corresponding to individual grid cells and intra-level edges forming 4-connected neighborhoods.
- Coarser Levels ($G^{(\ell)}$, $\ell > 1$): Each coarse node aggregates a non-overlapping block of its children in $G^{(\ell-1)}$, maintaining a 4-way adjacency structure at each resolution.
- Inter-Level Connections ($E^{(\ell,\ell+1)}$): Explicit edges connect each node in level $\ell$ to its parent in level $\ell+1$, enabling cross-scale information flow.
- Node/Edge Features: Each node $i$ carries a $d$-dimensional feature vector $h_i \in \mathbb{R}^d$; edges carry scalar features $e_{ij}$.
This hierarchical construction addresses the multi-scale nature of wildfire phenomena, which are influenced by local conditions (fuel, moisture), regional weather, and global teleconnections. A single-scale graph cannot feasibly represent both fine-grained patterns and large-scale dependencies without excessive computational cost. HiGO's hierarchy facilitates both local detail and global context efficiently, supporting multi-scale feature fusion.
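The pyramid construction above can be sketched in a few lines. The 2×2 pooling factor and the helper names below are illustrative assumptions, not details taken from HiGO itself:

```python
# Sketch of a multi-level grid pyramid: per level, the grid shape,
# the 4-connected intra-level edges, and each node's parent in the
# next-coarser level (2x2 block pooling is an assumed choice).

def build_pyramid(height, width, levels, pool=2):
    pyramid = []
    h, w = height, width
    for _ in range(levels):
        nodes = [(r, c) for r in range(h) for c in range(w)]
        # 4-connected neighborhoods: one edge right, one edge down per cell
        edges = [((r, c), (r + dr, c + dc))
                 for r, c in nodes
                 for dr, dc in ((0, 1), (1, 0))
                 if r + dr < h and c + dc < w]
        # inter-level connection: parent of each node in the coarser grid
        parents = {(r, c): (r // pool, c // pool) for r, c in nodes}
        pyramid.append({"shape": (h, w), "edges": edges, "parents": parents})
        h, w = (h + pool - 1) // pool, (w + pool - 1) // pool
    return pyramid

pyramid = build_pyramid(8, 8, levels=3)
```

Each level halves the resolution, so an 8×8 grid coarsens to 4×4 and then 2×2, mirroring the non-overlapping block aggregation described above.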
2. Adaptive Filtering Message Passing
HiGO employs context-aware adaptive message passing both within and across graph hierarchy levels:
Intra-Level Message Passing (Adaptive Message Passing, AdMP)
- Attention Computation: For node $i$ with neighbors $j \in \mathcal{N}(i)$ in the same level, the raw attention score is computed as
$$s_{ij} = \mathrm{MLP}_{\phi}\big([\,h_i \,\|\, h_j \,\|\, e_{ij}\,]\big),$$
where $\mathrm{MLP}_{\phi}$ is a multilayer perceptron (MLP) and $\|$ denotes concatenation.
- Normalization: The attention coefficients are normalized over $i$'s neighborhood:
$$\alpha_{ij} = \frac{\exp(s_{ij})}{\sum_{k \in \mathcal{N}(i)} \exp(s_{ik})}.$$
- Message Aggregation and Node Update: The aggregated message is
$$m_i = \sum_{j \in \mathcal{N}(i)} \alpha_{ij}\, W h_j,$$
and node features are updated via
$$h_i \leftarrow \sigma\big(W_0 h_i + m_i\big).$$
Inter-Level Information Flow
- Downsampling (Fine to Coarse): For each coarse node $c$ pooling its children $v \in \mathcal{C}(c)$, scores $s_{c,v} = \mathrm{MLP}_{\psi}([\,h_c \,\|\, h_v\,])$ yield coefficients
$$\beta_{c,v} = \frac{\exp(s_{c,v})}{\sum_{u \in \mathcal{C}(c)} \exp(s_{c,u})}$$
(normalized over children), and then
$$h_c \leftarrow \sum_{v \in \mathcal{C}(c)} \beta_{c,v}\, h_v.$$
- Upsampling (Coarse to Fine): Each child $v$'s features are updated from its parent $p(v)$ as
$$h_v \leftarrow h_v + \beta_{p(v),v}\, W_{\uparrow}\, h_{p(v)}.$$
All attention coefficients $\alpha_{ij}$ and $\beta_{c,v}$ act as dynamic filters, adaptively gating information flow between nodes or across scales.
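The intra-level AdMP step can be illustrated with a minimal numpy sketch. The single linear layer standing in for the MLP and the residual update are simplifying assumptions for illustration:

```python
import numpy as np

# Minimal sketch of adaptive intra-level message passing: score each
# edge from concatenated endpoint features plus the scalar edge feature,
# softmax the scores over each node's neighborhood, then aggregate.

rng = np.random.default_rng(0)
d = 4                                    # node feature dimension
h = rng.normal(size=(5, d))              # node features h_i
e = rng.normal(size=(5, 5))              # scalar edge features e_ij
nbrs = {0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1], 4: [2]}
w_att = rng.normal(size=2 * d + 1)       # linear stand-in for MLP_phi

def admp_step(h, e, nbrs, w_att):
    h_new = h.copy()
    for i, js in nbrs.items():
        # raw scores s_ij from [h_i || h_j || e_ij]
        s = np.array([np.concatenate([h[i], h[j], [e[i, j]]]) @ w_att
                      for j in js])
        alpha = np.exp(s - s.max())
        alpha /= alpha.sum()                                # softmax over N(i)
        m = sum(a * h[j] for a, j in zip(alpha, js))        # message m_i
        h_new[i] = h[i] + m                                 # residual update
    return h_new

h1 = admp_step(h, e, nbrs, w_att)
```

For a node with a single neighbor, the softmax coefficient is exactly 1, so the update reduces to adding that neighbor's features.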
3. Neural ODE Parameterization and Continuous-Time Modeling
HiGO models continuous-time dynamics using neural ODEs parameterized by GNNs:
- Per-Level ODE: For each level $\ell$,
$$\frac{d H^{(\ell)}(t)}{dt} = f_{\theta}^{(\ell)}\big(H^{(\ell)}(t),\, G^{(\ell)}\big),$$
where $H^{(\ell)}(t)$ is the set of all node features at level $\ell$ and $f_{\theta}^{(\ell)}$ is a GNN-based message-passing function.
- Joint Multi-Level Integration: The complete hierarchical state $H(t) = \big(H^{(1)}(t), \dots, H^{(L)}(t)\big)$ evolves as
$$\frac{d H(t)}{dt} = F_{\theta}\big(H(t),\, G\big).$$
Level coupling occurs exclusively through inter-level edges in the graph, not via time in the ODE vector field.
- Initialization and Solvers: Initial conditions fuse driver variables, climate indices, and the current burned-area map. The ODE integration utilizes the adaptive Dormand–Prince (RK45, "dopri5") solver, trading off computational cost and accuracy by dynamically adjusting step size.
This continuous-time formulation enables precise forecasting and interpolation of wildfire activity, adapting to the inherently non-uniform temporal evolution of such processes.
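The continuous-time mechanics can be sketched on a toy graph. HiGO uses the adaptive dopri5 solver; a fixed-step RK4 integrator stands in here for simplicity, and the diffusion-like linear "message-passing" vector field is an illustrative assumption:

```python
import numpy as np

# Node states H(t) evolve under a GNN-style vector field dH/dt = F(H),
# integrated numerically; any query time t (forecast or interpolation)
# is reached by integrating to that t.

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)     # toy 3-node path-graph adjacency
deg = A.sum(1, keepdims=True)

def field(H):
    # neighbor averaging minus self: a diffusion-like vector field
    return (A @ H) / deg - H

def rk4(H, t0, t1, steps=100):
    dt = (t1 - t0) / steps
    for _ in range(steps):
        k1 = field(H)
        k2 = field(H + 0.5 * dt * k1)
        k3 = field(H + 0.5 * dt * k2)
        k4 = field(H + dt * k3)
        H = H + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return H

H0 = np.array([[1.0], [0.0], [0.0]])
H8 = rk4(H0, 0.0, 8.0)   # state at t = 8; any intermediate t works too
```

Under this diffusion field the initial impulse relaxes toward a smooth equilibrium, which is why integrating to arbitrary times gives consistent interpolated states rather than discrete jumps.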
4. Training Procedure and Loss Formalism
The HiGO framework is optimized for point-wise multi-class (ordinal) classification over burned-area intervals at each grid cell:
- Logits and Probabilities: For cell $i$ and class $k$, logits $z_{i,k}$ are mapped to probabilities via a softmax:
$$p_{i,k} = \frac{\exp(z_{i,k})}{\sum_{k'} \exp(z_{i,k'})}.$$
- Loss Function: Weighted cross-entropy addresses severe class imbalance:
$$\mathcal{L} = -\sum_i \sum_k w_k\, y_{i,k} \log p_{i,k},$$
with weights $w_k$ inversely proportional to class frequency.
- Regularization: Standard weight decay is applied to the model parameters $\theta$.
- Forecast Scheduling: Forecast horizons are scheduled at 8, 16, ..., 48 days.
No auxiliary losses are required beyond this formulation.
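The objective above can be sketched directly. The three-class setup, the weight normalization, and the class frequencies are illustrative assumptions:

```python
import numpy as np

# Weighted cross-entropy over per-cell logits: softmax to probabilities,
# then negative log-likelihood weighted inversely by class frequency.

def weighted_ce(logits, labels, class_freq):
    """logits: (n_cells, n_classes); labels: (n_cells,) int class ids."""
    z = logits - logits.max(axis=1, keepdims=True)          # stable softmax
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)    # p_{i,k}
    w = 1.0 / np.asarray(class_freq)                        # w_k ∝ 1/freq
    w = w / w.sum() * len(class_freq)                       # normalize weights
    nll = -np.log(p[np.arange(len(labels)), labels] + 1e-12)
    return float((w[labels] * nll).mean())

logits = np.array([[ 2.0, 0.1, -1.0],
                   [ 0.2, 1.5,  0.3],
                   [-0.5, 0.0,  2.5]])
labels = np.array([0, 1, 2])
loss = weighted_ce(logits, labels, class_freq=[0.90, 0.08, 0.02])
```

Because the weights are normalized, scaling all class frequencies by a constant leaves the loss unchanged; only the relative rarity of classes matters.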
5. Empirical Performance on Global Wildfire Forecasting
The SeasFire Cube dataset serves as the primary benchmarking corpus for HiGO:
| Model Type | Baseline Methods |
|---|---|
| Point-wise | MLP, XGBoost |
| Vision-based | U-Net, ViT, Swin-Transformer |
| Graph-based | GCN, NDCN (Neural Dynamics on Complex Networks), GraphCast |
Key evaluation metrics are Macro-F1 score (on the "fire" class) and area under precision-recall curve (AUPRC), computed using binary fire/no-fire labels.
- Short-term (8d): HiGO attains M-F1 = 0.581 versus GraphCast's 0.575, AUPRC = 0.653 versus 0.631.
- Long-range (48d): HiGO achieves M-F1 = 0.423 versus GraphCast's 0.384 (+3.9 points), AUPRC = 0.522 versus 0.474 (+4.8 points), with a consistently increasing margin at greater forecast horizons.
- Continuous-Time Interpolation: When trained on 16-day data, HiGO evaluated at 8d yields M-F1 = 0.551 (NDCN: 0.493, GraphCast: 0.484), and at 24d yields M-F1 = 0.507 (NDCN: 0.486, GraphCast: 0.459).
- Observational Consistency: The method produces robust, physically consistent, continuous-time predictions.
6. Discussion, Significance, and Generalization
HiGO's stable continuous-time predictions arise from its single GNN-ODE formulation of the system's vector field $F_{\theta}$. This avoids the error-amplifying behavior of discrete recurrent schemes, and the adaptive-step ODE solvers automatically refine integration where system dynamics become stiff (e.g., rapid wildfire spread) while economizing computation in slowly evolving regimes.
Potential generalizations include application to any spatiotemporal phenomenon characterized by multi-scale coupling and continuous dynamics: examples include precipitation nowcasting, oceanic/atmospheric modeling, pollutant dispersion, epidemic modeling, and ecological invasion dynamics. The hierarchical pooling scheme could be replaced with alternative graph coarsening strategies for irregular domains, such as adaptive quadtrees or hemispherical meshes. Incorporation of physics-informed regularizers (e.g., enforcing conservation laws) into the ODE loss is another avenue for future research.
HiGO unifies multi-level graph modeling, adaptive filtering message passing, and continuous-time neural ODE integration, enabling state-of-the-art performance on challenging continuous-time forecasting benchmarks in Earth science domains (Xu et al., 4 Jan 2026).