Vocabulary for Universal Approximation: A Linguistic Perspective of Mapping Compositions

Published 20 May 2023 in cs.LG, cs.NA, math.DS, and math.NA | arXiv:2305.12205v2

Abstract: In recent years, deep learning-based sequence models, such as LLMs, have attracted much attention and achieved great success, motivating researchers to explore the possibility of transforming non-sequential problems into a sequential form. Following this idea, deep neural networks can be represented as composite functions of a sequence of mappings, linear or nonlinear, where each composition can be viewed as a \emph{word}. However, the weights of the linear mappings are undetermined and hence require an infinite number of words. In this article, we investigate the finite case and constructively prove the existence of a finite \emph{vocabulary} $V=\{\phi_i: \mathbb{R}^d \to \mathbb{R}^d \mid i=1,\dots,n\}$ with $n=O(d^2)$ for universal approximation. That is, for any continuous mapping $f: \mathbb{R}^d \to \mathbb{R}^d$, compact domain $\Omega$, and $\varepsilon>0$, there is a sequence of mappings $\phi_{i_1}, \dots, \phi_{i_m} \in V$, $m \in \mathbb{Z}_+$, such that the composition $\phi_{i_m} \circ \dots \circ \phi_{i_1}$ approximates $f$ on $\Omega$ with an error less than $\varepsilon$. Our results demonstrate an unusual approximation power of mapping compositions and motivate a novel compositional model for regular languages.
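To make the "vocabulary" framing concrete, here is a minimal Python sketch of composing mappings drawn from a small fixed set. The vocabulary below (two shifts and a fixed nonlinearity on $\mathbb{R}^1$) is purely illustrative and hypothetical; the paper's actual construction uses $O(d^2)$ carefully chosen flow maps on $\mathbb{R}^d$, not these particular functions.

```python
import numpy as np

# A toy "vocabulary" of mappings on R^1 (illustrative only; the paper's
# construction uses O(d^2) specific flow maps on R^d, not these).
EPS = 0.1  # step size of the elementary shifts (an assumed toy parameter)
vocab = {
    "shift+": lambda x: x + EPS,           # small positive translation
    "shift-": lambda x: x - EPS,           # small negative translation
    "relu":   lambda x: np.maximum(x, 0),  # a fixed nonlinearity
}

def compose(word_sequence, x):
    """Apply phi_{i_1}, ..., phi_{i_m} to x in order, i.e. the
    composition phi_{i_m} o ... o phi_{i_1} evaluated at x."""
    for word in word_sequence:
        x = vocab[word](x)
    return x

# Example "sentence": ["shift-", "relu", "shift+"] realizes the map
# x -> max(x - 0.1, 0) + 0.1 as a composition of vocabulary elements.
x = np.linspace(-1.0, 1.0, 5)
print(compose(["shift-", "relu", "shift+"], x))
```

Longer word sequences over such a fixed finite vocabulary are what the paper shows can approximate any continuous target map on a compact domain; the sketch only demonstrates the composition mechanism itself.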
