
Seasonal & Two-Interval Models

Updated 2 February 2026
  • Seasonal and Two-Interval Models are mathematical frameworks that represent time series with periodic or abrupt regime changes using continuous and piecewise-constant components.
  • They integrate trigonometric and state-space methodologies to model complex phenomena such as multi-seasonal cycles, long-memory effects, and regime-switching dynamics in diverse applications.
  • Key estimation and forecasting techniques, including maximum likelihood and Kalman filtering, are used to validate model identifiability and improve predictive performance in empirical studies.

Seasonal and two-interval models are mathematical and statistical frameworks for representing temporal processes whose underlying generating mechanisms change according to periodic or piecewise-constant seasonal regimes. These models are essential in econometrics, geostatistics, hydrology, epidemiology, and financial mathematics, permitting both explicit characterization of seasonally or periodically varying dynamics and capture of abrupt regime changes (e.g., dry/wet seasons, high/low transmission periods). Modern approaches unify continuous (e.g., trigonometric, Fourier) and discrete (e.g., square-wave, piecewise-constant) seasonal components with flexible stochastic and dependence structures, extending to multivariate, spatio-temporal, and long-memory settings.

1. Mathematical Formulation of Seasonal and Two-Interval Models

A broad class of seasonal models relies on decomposing the time axis into regimes, either through smooth periodic functions or a piecewise-constant "interval" structure. In a two-interval model, the period $T$ is divided into a "baseline" (low) and a "peak" (high) subinterval:
$$\beta(t) = \begin{cases} \beta_{\mathrm{low}}, & t \in [0, L_{\mathrm{low}}) \\ \beta_{\mathrm{high}}, & t \in [L_{\mathrm{low}}, T) \end{cases}$$
with $L_{\mathrm{low}} + L_{\mathrm{high}} = T$ (Hridoy, 2024; Tang et al., 2017). This framework is prominent in compartmental epidemic models, where transition parameters (e.g., infection rates) may shift discretely or follow square-wave functions; analogous structures underlie "regime-switching" time series.
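As a concrete illustration, the square-wave parameter above can be evaluated directly. The sketch below follows the formula; all default values (`beta_low`, `beta_high`, `L_low`, `T`) are illustrative assumptions, not taken from any cited model.

```python
# Minimal sketch of the two-interval (square-wave) parameter beta(t).
# The default values are illustrative assumptions.

def beta(t, beta_low=0.2, beta_high=0.8, L_low=200.0, T=365.0):
    """Return beta_low on [0, L_low) and beta_high on [L_low, T), period T."""
    phase = t % T  # position of t within the current period
    return beta_low if phase < L_low else beta_high

# Baseline early in the period, peak later, and the pattern repeats each period:
values = [beta(10.0), beta(300.0), beta(365.0 + 10.0)]
```

The modulo reduction makes the step function automatically $T$-periodic, which is the property exploited throughout the regime-switching formulations below.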

In harmonic (Fourier-based) seasonal models, the seasonal structure is captured by a finite sum of sinusoidal components:
$$Q_t = \sum_{k=1}^{K} \left[ A_k \cos(2\pi k t/s) + B_k \sin(2\pi k t/s) \right]$$
where $s$ is the season length (Kitagawa, 2024). For systems with multiple or nested seasonalities (e.g., daily and annual), distinct blocks of trigonometric terms, with harmonics selected for identifiability and orthogonality, are included.
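The harmonic sum can be computed directly from the coefficient lists; in the sketch below, $K$ is implied by the lengths of `A` and `B`, and the coefficient values and season length `s = 12` are illustrative assumptions.

```python
import math

# Sketch of the harmonic seasonal component Q_t defined above.
# Coefficient values and the season length s are illustrative assumptions.

def seasonal_component(t, A, B, s):
    """Q_t = sum_{k=1}^{K} [A_k cos(2 pi k t / s) + B_k sin(2 pi k t / s)]."""
    return sum(
        a * math.cos(2 * math.pi * k * t / s) + b * math.sin(2 * math.pi * k * t / s)
        for k, (a, b) in enumerate(zip(A, B), start=1)
    )

# Example: a "monthly" season (s = 12) with K = 2 harmonics.
A, B, s = [1.0, 0.5], [0.3, -0.2], 12
values = [seasonal_component(t, A, B, s) for t in range(2 * s)]
```

By construction the component is exactly $s$-periodic, since every harmonic completes an integer number of cycles over the season length.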

Long-memory and fractional-integration models employ season-dependent differencing operators, e.g., the $S$-periodic fractional difference $(1-L^S)^{D_t}$, with $D_t$ periodic (Bensalma, 2018). The configuration and ordering of the differencing and AR operators yield different cointegration and memory properties (see Section 4).

2. Model Classes and Identifiability

a. Trigonometric (Fourier) Seasonal Component Models

Models of the form

$$y_t = \mu + T_t + \sum_{\ell} Q^{(\ell)}_t + \varepsilon_t$$

incorporate multiple finite Fourier-series blocks $Q^{(\ell)}_t$ of periods $s_\ell$ (Kitagawa, 2024). Identifiability of the trigonometric coefficients is ensured by the orthogonality of the basis functions over full cycles; when periods are nested (e.g., $s_2 = m s_1$), coincident frequencies are excluded from one block to prevent collinearity.

A state-space embedding collects the seasonal coefficients as part of the latent state vector. The transition and observation matrices are block-diagonal, with special structure for the seasonal blocks (either random walks or rotation matrices for constant coefficients):
$$x_t = F x_{t-1} + G v_t, \qquad y_t = H_t x_t + w_t$$
where $x_t$ includes the trend, error, and all Fourier block coefficients. Identifiability and decoupling remain guaranteed when the regressors are orthogonal (Kitagawa, 2024).
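One way to realize the rotation-matrix seasonal blocks is sketched below: each harmonic $k$ of a season of length $s$ evolves by a $2\times 2$ rotation of angle $2\pi k/s$, and the blocks are stacked block-diagonally into the transition matrix $F$. This layout is an illustrative construction consistent with the formulation above, not a matrix taken verbatim from the cited work.

```python
import math

# Sketch: block-diagonal transition matrix F for a rotation-matrix
# seasonal component. Each 2x2 block rotates a harmonic's coefficient
# pair by 2*pi*k/s per time step, keeping its amplitude constant.

def rotation_block(k, s):
    th = 2 * math.pi * k / s
    return [[math.cos(th), math.sin(th)], [-math.sin(th), math.cos(th)]]

def seasonal_transition(K, s):
    """Block-diagonal F for harmonics k = 1..K (size 2K x 2K)."""
    n = 2 * K
    F = [[0.0] * n for _ in range(n)]
    for k in range(1, K + 1):
        blk = rotation_block(k, s)
        off = 2 * (k - 1)
        for i in range(2):
            for j in range(2):
                F[off + i][off + j] = blk[i][j]
    return F

F = seasonal_transition(K=2, s=12)
```

Because every block is an orthogonal rotation (determinant 1), the deterministic seasonal state traces a fixed-amplitude cycle; adding the disturbance $G v_t$ lets the seasonal pattern evolve slowly over time.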

b. Piecewise-Constant and Two-Interval Models

Two-interval models specify time-dependent parameters or inputs as step functions over the period, such as transmission rates in SIR/SIRS models and mean-reversion levels in stochastic volatility models:
$$\beta(t) = \begin{cases} \beta_1, & t \in J_1 \\ \beta_2, & t \in J_2 \end{cases}$$
with $J_1$, $J_2$ contiguous, disjoint intervals partitioning the period $T$ (Tang et al., 2017). This partitioning permits explicit operator and spectral-radius formulae for threshold and stability properties.

In the context of extreme value statistics and spatial extremes, the year is partitioned into contiguous “meteorological” seasons, with each interval modeled independently or with distinct dependence structures (Jurado et al., 2022).

c. Fractionally Differenced Periodic Models

Let $S$ denote the seasonal period and $D_t$ a periodic sequence. The operator $(1-L^S)^{D_t}$, applied before or after the AR polynomial in univariate or multivariate representations, yields either:

  • PSFI-PAR: each season $s$ is integrated of order $D_s$, with no cointegration (Bensalma, 2018).
  • PAR-PSFI: all seasons are cointegrated to the maximal memory $D_{\max}$, yielding a shared autocovariance decay rate.

Model form and operator ordering fundamentally determine the integration, cointegration, and asymptotic autocovariance structures.
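The fractional operator $(1-L^S)^D$ can be applied in practice through its binomial-series expansion, $w_0 = 1$, $w_j = w_{j-1}(j-1-D)/j$, with weight $w_j$ acting at lag $jS$. The sketch below uses a truncated expansion with a fixed (not season-varying) $D$ for simplicity; the values of `D`, `S`, and the input series are illustrative assumptions.

```python
# Sketch of the seasonal fractional-difference filter (1 - L^S)^D,
# truncated at n_terms lags. D, S, and the test series are illustrative.

def frac_diff_weights(D, n_terms):
    """Binomial-series weights: w_0 = 1, w_j = w_{j-1} * (j - 1 - D) / j."""
    w = [1.0]
    for j in range(1, n_terms):
        w.append(w[-1] * (j - 1 - D) / j)
    return w

def seasonal_frac_diff(x, D, S, n_terms=50):
    """Apply the truncated (1 - L^S)^D filter to the list x."""
    w = frac_diff_weights(D, n_terms)
    out = []
    for t in range(len(x)):
        acc = 0.0
        for j, wj in enumerate(w):
            lag = t - j * S
            if lag < 0:
                break
            acc += wj * x[lag]
        out.append(acc)
    return out

# Sanity check: with D = 1 the filter reduces to the ordinary
# seasonal difference x_t - x_{t-S}.
x = [float(i) for i in range(10)]
y = seasonal_frac_diff(x, D=1.0, S=4)
```

A season-varying $D_t$ would simply select a different weight sequence per season; the ordering issue discussed in Section 4 concerns whether this filter is applied before or after the periodic AR recursion.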

3. Estimation, Forecasting, and Model Comparison

Parameter Estimation

Gaussian maximum likelihood or Kalman-filter-based approaches are standard for trigonometric state-space models, leveraging orthogonality to decouple estimation of the Fourier coefficients. AIC and cross-validation are routinely used to select the number of harmonics:
$$\mathrm{AIC} = -2\ell(\hat\theta) + 2 \cdot (\text{number of parameters})$$
where $\ell(\cdot)$ is the Gaussian log-likelihood (Kitagawa, 2024).
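To make the harmonic-selection step concrete, the sketch below fits Fourier regressions of increasing order by least squares and compares Gaussian AIC values. The synthetic series, random seed, and grid of $K$ values are illustrative assumptions.

```python
import math
import numpy as np

# Sketch of harmonic-order selection by AIC: fit regressions with
# K = 1..4 harmonics and compare AIC = -2 * loglik + 2 * p under a
# Gaussian error model. The synthetic data are illustrative.

def design(n, K, s):
    """Regression matrix: intercept plus K cosine/sine harmonic pairs."""
    t = np.arange(n)
    cols = [np.ones(n)]
    for k in range(1, K + 1):
        cols.append(np.cos(2 * np.pi * k * t / s))
        cols.append(np.sin(2 * np.pi * k * t / s))
    return np.column_stack(cols)

def gaussian_aic(y, K, s):
    X = design(len(y), K, s)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    n = len(y)
    sigma2 = max(resid @ resid / n, 1e-300)
    loglik = -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)
    p = X.shape[1] + 1  # regression coefficients plus the error variance
    return -2 * loglik + 2 * p

rng = np.random.default_rng(0)
t = np.arange(240)
# Truth: two harmonics of a 12-period season, plus noise.
y = (2 * np.cos(2 * np.pi * t / 12) + np.sin(4 * np.pi * t / 12)
     + 0.3 * rng.standard_normal(240))
aics = {K: gaussian_aic(y, K, s=12) for K in (1, 2, 3, 4)}
```

Since the generating process contains a strong second harmonic, the $K=1$ fit has a much larger residual variance, and its AIC is clearly worse than that of $K=2$.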

In two-interval SIR/SIRS or SV models, explicit spectral properties or Riccati ODEs (solved via matrix exponentials or numerical integration) yield (i) reproduction numbers and (ii) closed-form characteristic functions for transition or marginal distributions (Tang et al., 2017; Schneider et al., 2015).

Nonparametric and two-parameter forecasting models employ calibration and rolling-window selection to minimize explicit error criteria (e.g., sum of squared or absolute errors), bypassing classical decomposition (Kahouadji, 2022). Seasonal extensions apply identical nonseasonal forecasting rules independently to each periodic subseries.

Forecasting

Point and interval forecasts for state-space/Fourier models are obtained by propagating the filtered state with the transition matrix $F$:
$$x_{t+h|t} = F^h x_{t|t}, \qquad y_{t+h|t} = H_{t+h}\, x_{t+h|t}$$
with predictive intervals from the forecast-variance recursion (Kitagawa, 2024).
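For the rotation-matrix seasonal form, the point-forecast recursion has a simple closed behavior: powers of the rotation trace out the seasonal cycle. The sketch below propagates a single-harmonic state ($s = 12$); the filtered state value is an illustrative assumption.

```python
import numpy as np

# Sketch of h-step forecasting by state propagation, x_{t+h|t} = F^h x_{t|t},
# for a single seasonal harmonic with a 2x2 rotation transition (s = 12).

s = 12
th = 2 * np.pi / s
F = np.array([[np.cos(th), np.sin(th)], [-np.sin(th), np.cos(th)]])
H = np.array([1.0, 0.0])        # observe the first state component
x_filt = np.array([1.0, 0.0])   # filtered state x_{t|t} (illustrative)

# h-step-ahead point forecasts for h = 1..s; these trace cos(2*pi*h/s).
forecasts = [float(H @ np.linalg.matrix_power(F, h) @ x_filt)
             for h in range(1, s + 1)]
```

After a full cycle ($h = s$) the rotation returns to the identity, so the $s$-step-ahead forecast equals the current one-step prediction level, as expected for a purely seasonal state.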

In piecewise-constant SIR/SIRS models, deterministic and branching-process approximations allow explicit computation of time-dependent extinction/outbreak probabilities, while Monte Carlo simulation (e.g., CTMC with inhomogeneous rates) is used for stochastic path generation (Hridoy, 2024).
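A minimal version of the stochastic-path generation can be sketched with a small-time-step binomial approximation to the inhomogeneous CTMC (exact event-driven simulation is also possible); the discretization scheme and all parameter values below are illustrative assumptions.

```python
import math
import random

# Monte Carlo sketch of a stochastic SIR path under a two-interval
# (square-wave) transmission rate, via a small-dt binomial approximation.

def beta(t, T=100.0, L_low=60.0, b_low=0.05, b_high=0.4):
    """Square-wave transmission rate: low season first, then high season."""
    return b_low if (t % T) < L_low else b_high

def simulate_sir(S, I, R, gamma=0.1, dt=0.1, t_max=100.0, seed=1):
    rng = random.Random(seed)
    N = S + I + R
    t, path = 0.0, [(0.0, I)]
    while t < t_max and I > 0:
        p_inf = 1.0 - math.exp(-beta(t) * I / N * dt)  # per-susceptible prob.
        p_rec = 1.0 - math.exp(-gamma * dt)            # per-infective prob.
        new_inf = sum(rng.random() < p_inf for _ in range(S))
        new_rec = sum(rng.random() < p_rec for _ in range(I))
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        t += dt
        path.append((t, I))
    return path

path = simulate_sir(S=190, I=10, R=0)
```

Repeating the simulation over many seeds gives Monte Carlo estimates of extinction probability within the low interval versus outbreak probability once the high-transmission interval begins.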

Skill assessment in spatial extremes is achieved by cross-validated quantile scores (QS) and Quantile Skill Index (QSI), comparing models that pool across sites and/or seasons (Jurado et al., 2022).

4. Long Memory, Cointegration, and Seasonal Fractional Differencing

Two distinct classes of seasonally fractionally differenced periodic processes exist, defined by the ordering of the periodic AR and fractional differencing operators (Bensalma, 2018):

  • PSFI–PAR: $(1-L^S)^{D_t}$ applied to the series before the periodic AR; each season is integrated at its own rate $D_s$; no cointegration; seasonal covariances decay as $j^{D_s + D_{s'} - 1}$.
  • PAR–PSFI: the periodic AR is applied first, then $(1-L^S)^{D_t}$ to the innovations; the entire vector process is $I(D_{\max})$; $S-1$ cointegration relations arise if the $D_s$ are not all equal; all covariances decay as $j^{2D_{\max} - 1}$.

The autocovariance structure, integration order, and cross-seasonal memory are thus determined by model structure, with explicit distinction between “separate-integration” and “cointegrated” models.

5. Empirical Applications and Comparative Findings

Economic/Ecological Time Series

Empirical analyses demonstrate that flexible trigonometric models can outperform standard decomposition or ARIMA frameworks, especially where the seasonal window is long or multiple cycles are present:

  • For monthly CO₂ data, K = 7 harmonics sufficed, with an AIC (≈1479) substantially lower than that of classical decomposition (AIC ≈ 1568).
  • For Tokyo electricity demand (two seasonalities: daily and weekly), the two-Fourier-block model (K₁ = 12 for the daily cycle; K₂ optimal at 16, excluding harmonics coinciding with the daily block) adapted to complex periodicities unresolvable by single-block models (Kitagawa, 2024).

Nonparametric two-parameter models—requiring no explicit seasonal decomposition—delivered lower sum of absolute errors and higher coverage than both Holt-Winters and ARIMA across education, sales, finance, and economy datasets (Kahouadji, 2022).

Stochastic Epidemic Models

Two-interval forcing in epidemic models robustly alters outbreak probability and asymptotic persistence. Peaks in high-transmission intervals can drive the basic reproduction number $\mathcal{R}_0$ above 1 even when the time-averaged value would suggest extinction. Explicit formulae for $\mathcal{R}_0$, as the spectral radius of the monodromy matrix or via the operator formalism, confirm the threshold dynamics and global stability properties (Tang et al., 2017; Hridoy, 2024).
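The spectral-radius threshold test can be sketched numerically: for a linearized infection subsystem $x' = A_i x$ on interval $J_i$ with lengths $L_1, L_2$, the monodromy matrix over one period is $M = e^{A_2 L_2} e^{A_1 L_1}$, and persistence corresponds to $\rho(M) > 1$. The $2\times2$ "exposed/infectious" block and all parameter values below are illustrative assumptions, not taken from a specific cited model.

```python
import numpy as np

# Sketch of the monodromy-matrix threshold test for a two-interval system.

def expm(A, s=20, terms=25):
    """Matrix exponential via scaling-and-squaring with a Taylor series."""
    B = A / (2.0 ** s)
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ B / k
        out = out + term
    for _ in range(s):
        out = out @ out
    return out

def spectral_radius(M):
    return float(max(abs(np.linalg.eigvals(M))))

sigma, gamma = 0.3, 0.2  # progression and recovery rates (illustrative)

def A(beta):
    # Linearized E/I block: infections enter E at rate beta * I.
    return np.array([[-sigma, beta], [sigma, -gamma]])

L_low, L_high = 60.0, 40.0
M_forced = expm(A(0.6) * L_high) @ expm(A(0.1) * L_low)  # two-interval forcing
M_low = expm(A(0.1) * (L_low + L_high))                  # low season all period

rho_forced = spectral_radius(M_forced)  # high season sustains outbreaks
rho_low = spectral_radius(M_low)        # extinction without the peak
```

Here the high-transmission interval alone is enough to push the per-period growth factor above 1, while the low-transmission rate running for the whole period gives a spectral radius below 1.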

Spatial Extremes

A two-interval seasonal approach (summer vs. winter) applied to precipitation maxima reveals sharp contrasts in the spatial range and smoothness of extremal dependence. The isotropic Brown-Resnick max-stable model fits summer convective rainfall well ($\rho^{\text{sum}} \approx 4{,}896$–$22{,}993$ m, $\alpha^{\text{sum}} \approx 0.40$–$0.54$), but fails to capture the extended dependence of winter frontal rainfall, where the fitted $\rho^{\text{win}} \approx 43{,}870$–$53{,}228$ m and the QSI is negative, indicating model misspecification and motivating anisotropic or non-stationary extensions (Jurado et al., 2022).

Stochastic Volatility

Multi-factor stochastic volatility models with seasonal drift (e.g., CIR/Heston with a general periodic $\theta_j(t)$) capture both the volatility smile and time-dependent (seasonal) instantaneous correlations, which is especially relevant for term structures and calendar-spread options on commodity futures (Schneider et al., 2015).

6. Practical Guidelines, Limitations, and Model Selection

  • For smooth seasonal patterns, select a small number of harmonics via information criterion; for multiple nested cycles, delete overlapping harmonics for identifiability (Kitagawa, 2024).
  • Use piecewise-constant/interval models (e.g., square-wave) when regime transition is abrupt or externally imposed (school terms, wet/dry seasons).
  • In fractional models, carefully consider operator ordering to determine whether separate per-season integration or global cointegration is appropriate (Bensalma, 2018).
  • Diagnostic tools: empirical extremal coefficients, cross-validated quantile scores (spatial extremes); Lyapunov functions, Poincaré maps (epidemic persistence).
  • Model extensions include wavelet-type or spline-based seasonal bases, anisotropic/non-stationary spatial dependencies, and stochastic seasonal amplitudes.
  • Limitation: finite Fourier models may require large KK or time-varying bases under transient/abrupt regime change; isotropic spatial models fail under strong anisotropy.

7. Summary Table: Key Model Structures and Distinctions

| Model Class | Formulation/Structure | Seasonality Mechanism |
| --- | --- | --- |
| Trigonometric/Fourier | $y_t = \sum_k [A_k \cos(\cdot) + B_k \sin(\cdot)]$ | Finite orthogonal basis |
| Two-Interval Piecewise | $\beta(t) = \beta_1$ on $J_1$, $\beta_2$ on $J_2$ | Step function (square wave) |
| Fractional Differencing | $(1 - L^S)^{D_t}$ (pre- or post-AR polynomial) | Periodic long memory |
| State-Space | $x_t = F x_{t-1} + G v_t$; $y_t = H_t x_t + w_t$ | Latent seasonal states |
| Max-Stable (Spatial) | Brown-Resnick with stepwise variogram | Separate seasonal dependence |

Seasonal and two-interval models thus comprise a unified but heterogeneous toolkit, supporting a broad spectrum of process types and structural complexities, with rigorous identifiability and interpretability, and empirically demonstrated gains in predictive skill and mechanistic clarity across fields (Kitagawa, 2024, Hridoy, 2024, Kahouadji, 2022, Tang et al., 2017, Jurado et al., 2022, Bensalma, 2018, Schneider et al., 2015).
