
Fixed-Frequency Fourier Features

Updated 7 February 2026
  • Fixed-frequency Fourier features are deterministic mappings using fixed sinusoidal basis functions that enable structured high-dimensional representations.
  • They facilitate efficient kernel approximations, time-frequency neural encodings, and signal processing by aligning frequency grids with domain-specific properties.
  • Applications span neural network positional encoding, time series decomposition, and performance improvements in reconstruction and training acceleration.

Fixed-frequency Fourier features refer to deterministic mappings from input coordinates or signals to high-dimensional spaces via explicit sinusoidal functions at fixed frequencies. These features serve as structured, periodic basis elements useful for a range of machine learning tasks, including kernel approximation, neural network positional encoding, and efficient signal representation. Fixed-frequency encodings contrast with random Fourier features (RFF) in that their frequency grid is predetermined and typically chosen to match analytic or empirical properties of the data or domain.

1. Mathematical Formulation of Fixed-Frequency Fourier Features

Let $x \in \mathbb{R}^d$. A general fixed-frequency Fourier feature map takes the form
$$\phi(x) = \left[\,\sin(\omega_1^\top x),\ \cos(\omega_1^\top x),\ \ldots,\ \sin(\omega_m^\top x),\ \cos(\omega_m^\top x)\,\right]^\top,$$
where $\{\omega_i\}_{i=1}^{m}$ is a deterministic, typically structured collection of frequency vectors. In one-dimensional settings, $\omega_k = k\pi$ is a standard choice, producing an ordered set of harmonics. For multivariate periodic kernels, structured index sets $I \subset \mathbb{Z}^d$ (tensor-grid, total-degree, hyperbolic-cross, etc.) are chosen, and
$$\phi_I(x) = \left[\,\sqrt{\alpha_s}\cos(2\pi k_s^\top x),\ \sqrt{\alpha_s}\sin(2\pi k_s^\top x)\,\right]_{s=1}^{|I|},$$
with $\alpha_s$ determined by the kernel's Fourier spectrum (Tompkins et al., 2018).
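
As a concrete sketch, the one-dimensional map with the harmonic choice $\omega_k = k\pi$ can be written in a few lines of Python (the function name is illustrative, not from any cited library):

```python
import math

def fourier_features(x, m):
    """Fixed-frequency Fourier feature map for scalar x, using the
    deterministic harmonic frequencies omega_k = k*pi (k = 1..m).
    Returns the 2m-dimensional vector [sin(w1 x), cos(w1 x), ...]."""
    feats = []
    for k in range(1, m + 1):
        w = k * math.pi  # fixed, not sampled: contrast with RFF
        feats.append(math.sin(w * x))
        feats.append(math.cos(w * x))
    return feats

# Example: a 6-dimensional embedding of x = 0.5
phi = fourier_features(0.5, m=3)
```

Unlike random Fourier features, running this map twice (or on two machines) yields identical embeddings, since the frequency grid is fixed by construction.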

2. Basis Expansions and Time-Frequency Representations

Within signal processing, the fixed-frequency expansion provides both time and frequency information. Let $x[n],\ n = 0, \ldots, T-1$ be a signal; the basis functions $\varphi_k^{\cos}(n) = \cos(2\pi k n / T)$ and $\varphi_k^{\sin}(n) = \sin(2\pi k n / T)$ for $k = 0, \ldots, T/2$ (or relevant partial sums) constitute an orthogonal set spanning real signals. The mapping is explicit:
$$x[n] = \frac{1}{T} H_R[0] + \frac{1}{T} H_R[T/2]\,\varphi_{T/2}^{\cos}(n) + \frac{2}{T} \sum_{k=1}^{T/2-1} \Big( H_R[k]\,\varphi_k^{\cos}(n) - H_I[k]\,\varphi_k^{\sin}(n) \Big),$$
with $H_R[k], H_I[k]$ the real and imaginary parts of the DFT coefficients (Yang et al., 13 Jul 2025). These bases serve as foundational components in time-frequency neural representations such as Fourier Basis Mapping (FBM).
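
The reconstruction identity above can be checked numerically. A minimal pure-Python sketch, using a naive $O(T^2)$ DFT for illustration only (a real implementation would use an FFT):

```python
import math

def dft(x):
    """Naive DFT of a real signal: returns real parts H_R[k]
    and imaginary parts H_I[k] for k = 0..T-1."""
    T = len(x)
    HR = [sum(x[n] * math.cos(2 * math.pi * k * n / T) for n in range(T))
          for k in range(T)]
    HI = [-sum(x[n] * math.sin(2 * math.pi * k * n / T) for n in range(T))
          for k in range(T)]
    return HR, HI

def reconstruct(HR, HI, T, n):
    """x[n] from the fixed-frequency cosine/sine basis expansion (even T)."""
    # DC term plus Nyquist term phi_{T/2}^cos(n) = cos(pi*n) = (-1)^n
    val = HR[0] / T + HR[T // 2] * math.cos(math.pi * n) / T
    for k in range(1, T // 2):
        c = math.cos(2 * math.pi * k * n / T)
        s = math.sin(2 * math.pi * k * n / T)
        val += (2.0 / T) * (HR[k] * c - HI[k] * s)
    return val

T = 8
x = [math.sin(2 * math.pi * n / T) + 0.3 * n for n in range(T)]
HR, HI = dft(x)
recon = [reconstruct(HR, HI, T, n) for n in range(T)]  # matches x
```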

3. Applications in Neural Networks and Kernel Approximation

Fixed-frequency Fourier features are widely deployed in several contexts:

  • Positional Encoding: In neural fields, coordinates are mapped via multiscale sinusoidal embeddings, e.g., $\gamma(x) = [\sin(2\pi v_1 x), \cos(2\pi v_1 x), \ldots, \sin(2\pi v_L x), \cos(2\pi v_L x)]$, where $v_i = 2^{i-1}$ to cover exponentially increasing bandwidth. This approach provides a continuous, periodic, and multiresolution encoding tailored for MLPs and NeRF-type architectures (Lee et al., 2022).
  • Kernel Methods: Index-Set Fourier Series Features (ISFSF) achieve deterministic, spectrally controlled kernel approximations. Selection of the frequency set $I$ allows direct control of the truncation error and computational complexity, outperforming RFF on periodic or smooth kernels (Tompkins et al., 2018).
  • Tabular Data and Deep Learning: Fourier mappings whose frequencies are drawn randomly once at initialization and then held fixed, used as parameter-free pre-processors, condition the network's Neural Tangent Kernel (NTK) and accelerate training (Sergazinov et al., 3 Jun 2025).
  • Time Series: FBM applies an explicit time-frequency feature map using complete (tiered) Fourier bases, supporting plug-and-play integration into different neural architectures (linear, MLP, Transformer), and specialized decomposition into seasonal, trend, and interaction effects (Yang et al., 13 Jul 2025).
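
The multiscale positional encoding from the first bullet is small enough to sketch directly, assuming the $v_i = 2^{i-1}$ frequency schedule:

```python
import math

def positional_encoding(x, L):
    """NeRF-style multiscale encoding gamma(x) with v_i = 2^(i-1).
    Maps a scalar coordinate to a 2L-dimensional feature vector."""
    out = []
    for i in range(1, L + 1):
        v = 2.0 ** (i - 1)  # exponentially increasing frequency
        out.append(math.sin(2 * math.pi * v * x))
        out.append(math.cos(2 * math.pi * v * x))
    return out

gamma = positional_encoding(0.25, L=4)  # 8-dimensional embedding
```

In practice the encoding is applied per coordinate dimension and the results are concatenated before being fed to the MLP.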

4. Structured Design of Frequency Grids

The choice of frequency set is critical for balancing expressivity and computational efficiency:

  • Grid-based Selection: Full tensor grids, total-degree sets ($\ell_1$-ball), Euclidean ($\ell_2$-ball), hyperbolic-cross, and energy-norm hyperbolic-cross sets are the principal constructions (Tompkins et al., 2018). Tensor-grid: $I_T(R) = \{\mathbf{k} : |k_d| < R_d\ \forall d\}$. Total-degree: $I_{\ell_1}(R) = \{\mathbf{k} : \sum_d |k_d| < R\}$.
  • Multiscale Priors: Exponentially scaled frequencies $v_i = 2^{i-1}$ establish a hierarchical encoding for capturing both low- and high-frequency content efficiently (Lee et al., 2022).
  • Application-specific Grids: The choice is adapted to smoothness, periodicity, and dimensionality. For $D \leq 3$, full or ball-shaped sets with moderate refinement are used; high $D$ favors more aggressive sparsification (hyperbolic-cross, ENHC).

Approximation error decays rapidly with increased coverage of the frequency spectrum, and deterministic selection allows for both accurate low-frequency modeling and the systematic inclusion of high-frequency details (Tompkins et al., 2018).
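
The index sets above can be enumerated directly for small dimensions. A sketch with illustrative helper names, using the isotropic bound $|k_d| < R$ for the tensor grid and the common product-form condition for the hyperbolic cross:

```python
import math
from itertools import product

def tensor_grid(R, d):
    """Full tensor grid: {k : |k_j| < R for every dimension j}."""
    return list(product(range(-R + 1, R), repeat=d))

def total_degree(R, d):
    """Total-degree (l1-ball) set: {k : sum_j |k_j| < R}."""
    return [k for k in tensor_grid(R, d) if sum(abs(v) for v in k) < R]

def hyperbolic_cross(R, d):
    """Hyperbolic-cross set: {k : prod_j max(|k_j|, 1) < R},
    which thins out much more aggressively as d grows."""
    return [k for k in tensor_grid(R, d)
            if math.prod(max(abs(v), 1) for v in k) < R]

# Set sizes for d = 3, R = 4 illustrate the sparsification trade-off
sizes = [len(s(4, 3)) for s in (tensor_grid, total_degree, hyperbolic_cross)]
```

Each set trades coverage of the frequency lattice against feature count, which is exactly the expressivity/efficiency balance discussed above.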

5. Algorithmic Implementations and Efficiency

Efficient computation of fixed-frequency features arises both in feature mapping and in direct computation of selected DFT components:

  • QFF (Quantized Fourier Features): The embedding is quantized into bins per sinusoidal component. For each input dimension, interpolation over $M$ bins provides local adaptivity while retaining global periodicity. Factorized quantization (QFF-3D) allows for explicit cross-dimensional interactions via bilinear or vector-matrix decompositions (Lee et al., 2022).
  • Partial DFT Algorithms: The Fast Partial Fourier Transform (PFT) computes arbitrary blocks of $k$ consecutive DFT coefficients in $O(n + k\log k)$ time by polynomially approximating slowly oscillating twiddle factors, restructuring the Cooley–Tukey recursion, and leveraging small FFTs and matrix multiplications. This enables scalable selective spectral analysis and feature extraction (Park et al., 2020).
  • Neural Integration: Features are either concatenated with continuous sin/cos embeddings and fed to deep networks, or used as direct plug-ins to the input layer with subsequent projection (e.g., flattening $F[n,k]$ in FBM, or patchwise feature projection for Transformer architectures) (Yang et al., 13 Jul 2025).
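
To make the partial-DFT interface concrete, here is the naive $O(nk)$ baseline that computes one block of consecutive coefficients; this is the operation PFT accelerates to $O(n + k\log k)$, not the PFT algorithm itself:

```python
import cmath
import math

def partial_dft(x, k_start, k_count):
    """Directly compute the DFT coefficients H[k_start .. k_start+k_count-1]
    of a length-n signal. Cost is O(n * k_count): the baseline that the
    Fast Partial Fourier Transform improves upon."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(k_start, k_start + k_count)]

# Example: coefficients H[2]..H[5] of a pure tone at frequency 3
n = 16
sig = [math.cos(2 * math.pi * 3 * t / n) for t in range(n)]
block = partial_dft(sig, 2, 4)  # energy concentrates at H[3]
```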

6. Empirical Performance, Theoretical Guarantees, and Regularization

Fixed-frequency Fourier feature architectures exhibit rapid convergence, spectral expressivity, and robustness:

  • Training Acceleration: Pre-processing with fixed-frequency features bounds and conditions the NTK, yielding lower condition numbers, faster error decay, and reduced sensitivity to hyperparameter tuning (Sergazinov et al., 3 Jun 2025).
  • Spectral Bias Mitigation: MLPs with standard positional encodings exhibit spectral bias favoring low-frequency content. Fixed-frequency layers, augmented with learnable diagonal gates, support direct, theoretically justified selection of relevant frequencies via implicit $\ell_1$-type regularization, extracting sparse and interpretable frequency supports (Jeong et al., 2024).
  • Quantitative Evaluation: For NIR and NeRF tasks, QFF-Lite improves PSNR by +0.9 to +1.5 dB over unaugmented MLPs with reduced parameter budgets, and QFF-3D nearly matches much larger explicit-grid models (TensoRF-VM) with far fewer parameters and faster convergence (Lee et al., 2022).
  • Time-Frequency Task Decomposition: FBM-S decomposes forecasts into seasonal, trend, and interaction blocks, systematically leveraging the time-frequency structure and multi-scale feature pooling for state-of-the-art performance in time series forecasting (Yang et al., 13 Jul 2025).

7. Limitations, Parameterization, and Practical Considerations

Limitations and guidelines for practical utilization include:

  • Interpolation-induced Discontinuities: Piecewise-linear interpolation in quantized schemes may introduce sharp derivative changes at bin boundaries, partially ameliorated by adding back the continuous sinusoid (Lee et al., 2022).
  • Memory Costs: Factorized quantizations (QFF-3D) grow as $O(M^2)$ in the number of bins, constraining scalability at very high quantization resolutions.
  • Expressive Capacity versus Overfitting: In sparse-view SDF reconstruction, the smooth bias of unquantized MLPs may outperform finely quantized features.
  • Hyperparameter Choices: For QFF, $L \approx 6$ frequency levels, $M \approx 128$ bins, $N \approx 16$ feature dimensions, and depth adjusted according to factorization (deeper for QFF-Lite, shallower for QFF-3D) provide effective trade-offs (Lee et al., 2022). In diagonal-gated architectures, use $M = 2m$ to $4m+1$ and moderate $\ell_2$ weight decay ($\lambda \approx 0.01$–$0.1$) (Jeong et al., 2024).

The use of deterministic, fixed-frequency Fourier features supplies a structured, interpretable basis for both data representation and learning, with theoretical and empirical advantages over random feature methods in regimes with prominent spectral structure, periodicity, or where explicit control over the bandwidth is vital. Key methodologies include QFF (Lee et al., 2022), ISFSF (Tompkins et al., 2018), partial Fourier transforms (Park et al., 2020), and advanced time-frequency neural frameworks including FBM (Yang et al., 13 Jul 2025).
