Characterize activation functions that are γ-parameter bounding
Determine which activation functions, beyond the Rectified Linear Unit (ReLU) and the Heaviside step function, are γ-parameter bounding. An activation function is γ-parameter bounding if it permits universal approximation by single-hidden-layer feed-forward neural networks in which every individual scalar parameter (each weight and bias) is bounded within the interval [-γ, γ] for a given γ > 0, with approximation error measured in the L1 norm over compact input domains.
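For concreteness, here is one way to formalize the property, as a sketch only: the symbols σ, K, f, N, w_i, b_i, v_i are our notation rather than the paper's, and the paper's exact statement may differ in details (e.g., whether an output bias is included). An activation σ : ℝ → ℝ would be γ-parameter bounding when

\[
\forall\, K \subset \mathbb{R}^d \text{ compact},\;
\forall\, f \in L^1(K),\;
\forall\, \varepsilon > 0:\quad
\exists\, N \in \mathbb{N},\;
\{(w_i, b_i, v_i)\}_{i=1}^{N}
\text{ with } \|w_i\|_\infty,\, |b_i|,\, |v_i| \le \gamma,
\]
\[
\Bigl\| f - \sum_{i=1}^{N} v_i\, \sigma\bigl(w_i^{\top} x + b_i\bigr) \Bigr\|_{L^1(K)} \le \varepsilon.
\]

Note that the bound γ is fixed in advance and does not grow with the width N; the network may compensate for the parameter constraint only by adding more hidden units.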
References
"We leave it to future work to determine which other activations are parameter bounding."
— Expressivity of Neural Networks with Random Weights and Learned Biases (Williams et al., 2024, arXiv:2407.00957), Section 2.1 (Feed-forward neural networks), following the proposition that ReLU is parameter bounding (labeled prop:relu-parambound in the source).