Fixed-Frequency Fourier Features
- Fixed-frequency Fourier features are deterministic mappings using fixed sinusoidal basis functions that enable structured high-dimensional representations.
- They facilitate efficient kernel approximations, time-frequency neural encodings, and signal processing by aligning frequency grids with domain-specific properties.
- Applications span neural network positional encoding, time series decomposition, and improvements in reconstruction quality and training speed.
Fixed-frequency Fourier features refer to deterministic mappings from input coordinates or signals to high-dimensional spaces via explicit sinusoidal functions at fixed frequencies. These features serve as structured, periodic basis elements useful for a range of machine learning tasks, including kernel approximation, neural network positional encoding, and efficient signal representation. Fixed-frequency encodings contrast with random Fourier features (RFF) in that their frequency grid is predetermined and typically chosen to match analytic or empirical properties of the data or domain.
1. Mathematical Formulation of Fixed-Frequency Fourier Features
Let $x \in \mathbb{R}^d$. A general fixed-frequency Fourier feature map takes the form $\phi(x) = [\cos(\omega_1^\top x), \sin(\omega_1^\top x), \dots, \cos(\omega_M^\top x), \sin(\omega_M^\top x)]$, where $\Omega = \{\omega_1, \dots, \omega_M\}$ is a deterministic, typically structured collection of frequency vectors. In one-dimensional settings, $\omega_m = m\omega_0$ for $m = 1, \dots, M$ is a standard choice, producing an ordered set of harmonics of a base frequency $\omega_0$. For multivariate periodic kernels, structured index sets $\mathcal{I} \subset \mathbb{Z}^d$ (tensor-grid, total degree, hyperbolic cross, etc.) are chosen, with the associated frequencies and feature weights determined by the kernel's Fourier spectrum (Tompkins et al., 2018).
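As a minimal sketch of the map above (function names and the harmonic grid are illustrative, not from the cited works), the feature map can be implemented directly:

```python
import numpy as np

def fixed_fourier_features(x, freqs):
    """Map inputs x of shape (n, d) through fixed frequencies freqs of shape (M, d):
    phi(x) = [cos(w_1.x), ..., cos(w_M.x), sin(w_1.x), ..., sin(w_M.x)]."""
    proj = x @ freqs.T                                            # (n, M) inner products
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=1)   # (n, 2M)

# 1-D harmonic grid w_m = m * w0 for m = 1..M (illustrative base frequency w0)
w0, M = 2 * np.pi, 4
freqs = (w0 * np.arange(1, M + 1))[:, None]    # (M, 1)
x = np.linspace(0.0, 1.0, 5)[:, None]          # (5, 1)
phi = fixed_fourier_features(x, freqs)         # (5, 8)
```

Because every frequency is an integer multiple of $\omega_0 = 2\pi$, the embedding is periodic with period 1: the endpoints $x = 0$ and $x = 1$ map to identical features.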
2. Basis Expansions and Time-Frequency Representations
Within signal processing, the fixed-frequency expansion provides both time and frequency information. Let $x[t]$, $t = 0, \dots, N-1$, be a real signal; the basis functions $\cos(2\pi k t/N)$ and $\sin(2\pi k t/N)$ for $k = 0, \dots, \lfloor N/2 \rfloor$ (or relevant partial sums) constitute an orthogonal set spanning real signals. The mapping is explicit: $x[t] = \sum_k a_k \cos(2\pi k t/N) + b_k \sin(2\pi k t/N)$, with $(a_k, b_k)$ determined by the DFT coefficients (Yang et al., 13 Jul 2025). These bases serve as foundational components in time-frequency neural representations such as Fourier Basis Mapping (FBM).
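The expansion can be checked numerically. This sketch (an illustration of the basis identity, not the FBM implementation) reconstructs a real signal exactly from its cos/sin DFT coefficients:

```python
import numpy as np

N = 16
t = np.arange(N)
x = np.sin(2 * np.pi * 3 * t / N) + 0.5 * np.cos(2 * np.pi * 5 * t / N)

X = np.fft.rfft(x)                 # complex coefficients for k = 0 .. N/2
k = np.arange(N // 2 + 1)
a = X.real.copy() / N              # cosine coefficients a_k
b = -X.imag.copy() / N             # sine coefficients b_k
a[1:N // 2] *= 2                   # interior bins appear twice in the full
b[1:N // 2] *= 2                   # spectrum (conjugate symmetry)

basis_cos = np.cos(2 * np.pi * np.outer(k, t) / N)   # (N/2+1, N)
basis_sin = np.sin(2 * np.pi * np.outer(k, t) / N)
x_rec = a @ basis_cos + b @ basis_sin                # exact reconstruction
```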
3. Applications in Neural Networks and Kernel Approximation
Fixed-frequency Fourier features are widely deployed in several contexts:
- Positional Encoding: In neural fields, coordinates are mapped via multiscale sinusoidal embeddings, e.g., $\gamma(x) = [\sin(2^0 \pi x), \cos(2^0 \pi x), \dots, \sin(2^{L-1}\pi x), \cos(2^{L-1}\pi x)]$, where the frequencies $2^j \pi$ cover exponentially increasing bandwidth. This approach provides a continuous, periodic, and multiresolution encoding tailored for MLPs and NeRF-type architectures (Lee et al., 2022).
- Kernel Methods: Index-Set Fourier Series Features (ISFSF) achieve deterministic, spectrally controlled kernel approximations. Selection of the frequency set allows direct control of the truncation error and computational complexity, outperforming RFF on periodic or smooth kernels (Tompkins et al., 2018).
- Tabular Data and Deep Learning: Fourier mappings whose frequencies are randomly sampled but frozen at initialization act as parameter-free pre-processors that condition the network's Neural Tangent Kernel (NTK) and accelerate training (Sergazinov et al., 3 Jun 2025).
- Time Series: FBM applies an explicit time-frequency feature map using complete (tiered) Fourier bases, supporting plug-and-play integration into different neural architectures (linear, MLP, Transformer), and specialized decomposition into seasonal, trend, and interaction effects (Yang et al., 13 Jul 2025).
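A minimal sketch of the multiscale positional encoding described above (level count, shapes, and function names are illustrative):

```python
import numpy as np

def positional_encoding(x, num_levels):
    """Multiscale encoding: for each coordinate, emit sin(2^j * pi * x) and
    cos(2^j * pi * x) for j = 0 .. num_levels-1. Input x has shape (n, d)."""
    scales = (2.0 ** np.arange(num_levels)) * np.pi   # frequencies 2^j * pi
    angles = x[..., None] * scales                    # (n, d, L)
    enc = np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)
    return enc.reshape(x.shape[0], -1)                # (n, 2 * d * L)

coords = np.random.default_rng(0).uniform(-1, 1, size=(3, 2))
emb = positional_encoding(coords, num_levels=6)       # (3, 24)
```

Each coordinate contributes $2L$ bounded features, so the embedding width grows linearly in both the input dimension and the number of levels.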
4. Structured Design of Frequency Grids
The choice of frequency set is critical for balancing expressivity and computational efficiency:
- Grid-based Selection: Full tensor grids, total-degree sets ($\ell_1$-ball), Euclidean balls ($\ell_2$-ball), hyperbolic crosses, and energy-norm hyperbolic crosses (ENHC) are the principal constructions (Tompkins et al., 2018). Tensor-grid: $\{n \in \mathbb{Z}^d : \|n\|_\infty \le k\}$. Total-degree: $\{n \in \mathbb{Z}^d : \|n\|_1 \le k\}$.
- Multiscale Priors: Exponentially scaled frequencies establish hierarchical encoding for capturing both low- and high-frequency content efficiently (Lee et al., 2022).
- Application-specific Grids: The choice is adapted to smoothness, periodicity, and dimensionality. For low dimension $d$, full tensor grids or ball-type sets with moderate refinement are used; high $d$ favors more aggressive sparsification (hyperbolic-cross, ENHC).
Approximation error decays rapidly with increased coverage of the frequency spectrum, and deterministic selection allows for both accurate low-frequency modeling and the systematic inclusion of high-frequency details (Tompkins et al., 2018).
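The relative sizes of these index sets can be illustrated with a small enumeration (the constructions follow the standard definitions; one non-negative orthant is shown for simplicity, and parameter names are ours):

```python
import itertools
import math

def index_set(d, k, kind):
    """Enumerate multi-indices n in {0..k}^d kept by three standard constructions."""
    kept = []
    for n in itertools.product(range(k + 1), repeat=d):
        if kind == "tensor":            # full tensor grid: max-norm ball
            keep = max(n) <= k
        elif kind == "total_degree":    # l1 ball
            keep = sum(n) <= k
        else:                           # hyperbolic cross: product criterion
            keep = math.prod(ni + 1 for ni in n) <= k + 1
        if keep:
            kept.append(n)
    return kept

d, k = 3, 4
sizes = {kind: len(index_set(d, k, kind))
         for kind in ("tensor", "total_degree", "hyperbolic")}
```

Already at $d = 3$, $k = 4$ the counts drop from 125 (tensor grid) to 35 (total degree) to 16 (hyperbolic cross), illustrating why sparser constructions are preferred as $d$ grows.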
5. Algorithmic Implementations and Efficiency
Efficient computation of fixed-frequency features arises both in feature mapping and in direct computation of selected DFT components:
- QFF (Quantized Fourier Features): The embedding is quantized into a fixed number of bins per sinusoidal component. For each input dimension, interpolation over bins provides local adaptivity while retaining global periodicity. Factorized quantization (QFF-3D) allows for explicit cross-dimensional interactions via bilinear or vector-matrix decompositions (Lee et al., 2022).
- Partial DFT Algorithms: The Fast Partial Fourier Transform (PFT) computes arbitrary blocks of consecutive DFT coefficients faster than a full FFT by polynomially approximating slowly oscillating twiddle factors, restructuring the Cooley–Tukey recursion, and leveraging small FFTs and matrix multiplications. This enables scalable selective spectral analysis and feature extraction (Park et al., 2020).
- Neural Integration: Features are either concatenated with continuous sin/cos embeddings and fed to deep networks, or used as direct plug-ins to the input layer with subsequent projection (e.g., flattening in FBM, or patchwise feature projection for Transformer architectures) (Yang et al., 13 Jul 2025).
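As a point of reference for what PFT accelerates (this is not the PFT algorithm itself, only the naive computation of one coefficient block), extracting $m$ consecutive DFT bins directly costs $O(Nm)$ versus $O(N \log N)$ for a full FFT:

```python
import numpy as np

def dft_block(x, k0, m):
    """Directly compute DFT coefficients X[k0 .. k0+m-1] of a length-N signal.
    Naive cost is O(N*m); PFT-style algorithms target this selective setting."""
    N = len(x)
    k = np.arange(k0, k0 + m)[:, None]   # (m, 1) target frequency bins
    t = np.arange(N)[None, :]            # (1, N) time indices
    return (x * np.exp(-2j * np.pi * k * t / N)).sum(axis=1)

rng = np.random.default_rng(1)
x = rng.standard_normal(64)
block = dft_block(x, k0=10, m=8)         # bins 10..17 only
```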
6. Empirical Performance, Theoretical Guarantees, and Regularization
Fixed-frequency Fourier feature architectures exhibit rapid convergence, spectral expressivity, and robustness:
- Training Acceleration: Pre-processing with fixed-frequency features bounds and conditions the NTK, yielding lower condition numbers, faster error decay, and reduced sensitivity to hyperparameter tuning (Sergazinov et al., 3 Jun 2025).
- Spectral Bias Mitigation: MLPs with standard positional encodings exhibit spectral bias favoring low-frequency content. Fixed-frequency layers augmented with learnable diagonal gates support direct, theoretically justified selection of relevant frequencies via implicit $\ell_1$-type regularization, extracting sparse and interpretable frequency supports (Jeong et al., 2024).
- Quantitative Evaluation: For neural implicit representation and NeRF tasks, QFF-Lite improves PSNR over unaugmented MLPs at reduced parameter budgets, and QFF-3D nearly matches much larger explicit-grid models (TensoRF-VM) with far fewer parameters and faster convergence (Lee et al., 2022).
- Time-Frequency Task Decomposition: FBM-S decomposes forecasts into seasonal, trend, and interaction blocks, systematically leveraging the time-frequency structure and multi-scale feature pooling for state-of-the-art performance in time series forecasting (Yang et al., 13 Jul 2025).
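A hedged sketch of the diagonal-gating idea above: learning per-frequency gates on a fixed Fourier basis under an explicit $\ell_1$ penalty, solved here with ISTA soft-thresholding as our illustrative stand-in for the implicit regularization in the cited work (signal, penalty strength, and grid are invented for the example):

```python
import numpy as np

# Signal with two active harmonics among M candidates.
t = np.linspace(0, 1, 200, endpoint=False)
y = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)

M = 10                                   # candidate harmonics 1..M
grid = 2 * np.pi * np.outer(t, np.arange(1, M + 1))
Phi = np.concatenate([np.sin(grid), np.cos(grid)], axis=1)   # (200, 2M) basis

# ISTA: gradient step on the least-squares loss, then soft-threshold (l1 prox).
g = np.zeros(2 * M)                      # diagonal gates, one per basis column
lam = 0.01                               # illustrative l1 penalty strength
step = 1.0 / np.linalg.norm(Phi, 2) ** 2
for _ in range(500):
    z = g - step * (Phi.T @ (Phi @ g - y))
    g = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

support = np.nonzero(np.abs(g) > 1e-3)[0]   # recovered sparse frequency support
```

On this orthogonal grid the gates recover exactly the two active basis columns, with near-true amplitudes; the $\ell_1$ prox zeroes every inactive frequency.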
7. Limitations, Parameterization, and Practical Considerations
Limitations and guidelines for practical utilization include:
- Interpolation-induced Discontinuities: Piecewise-linear interpolation in quantized schemes may introduce sharp derivative changes at bin boundaries, partially ameliorated by adding back the continuous sinusoid (Lee et al., 2022).
- Memory Costs: Memory for factorized quantizations (QFF-3D) grows with the number of bins, constraining scalability at very high quantization resolutions.
- Expressive Capacity versus Overfitting: In sparse-view SDF reconstruction, the smooth bias of unquantized MLPs may outperform finely quantized features.
- Hyperparameter Choices: For QFF, the numbers of frequency levels and bins, the feature dimension, and the network depth are adjusted according to the factorization (deeper for QFF-Lite, shallower for QFF-3D), providing effective trade-offs (Lee et al., 2022). In diagonal-gated architectures, use up to $4m+1$ features and moderate $\ell_1$ weight decay (up to $0.1$) (Jeong et al., 2024).
The use of deterministic, fixed-frequency Fourier features supplies a structured, interpretable basis for both data representation and learning, with theoretical and empirical advantages over random feature methods in regimes with prominent spectral structure, periodicity, or where explicit control over the bandwidth is vital. Key methodologies include QFF (Lee et al., 2022), ISFSF (Tompkins et al., 2018), partial Fourier transforms (Park et al., 2020), and advanced time-frequency neural frameworks including FBM (Yang et al., 13 Jul 2025).