Improved universal approximation with neural networks studied via affine-invariant subspaces of $L_2(\mathbb{R}^n)$
April 03, 2025 · Declared Dead · Electronic Journal of Applied Mathematics
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors: Cornelia Schneider, Samuel Probst
arXiv ID: 2504.02445
Category: math.FA
Cross-listed: cs.IT
Citations: 0
Venue: Electronic Journal of Applied Mathematics
Last Checked: 2 months ago
Abstract
We show that there are no non-trivial closed subspaces of $L_2(\mathbb{R}^n)$ that are invariant under invertible affine transformations. We apply this result to neural networks, showing that any nonzero $L_2(\mathbb{R})$ function is an adequate activation function in a one-hidden-layer neural network for approximating every function in $L_2(\mathbb{R})$ with any desired accuracy. This generalizes the universal approximation properties of neural networks in $L_2(\mathbb{R})$ related to Wiener's Tauberian Theorems. Our results extend to the spaces $L_p(\mathbb{R})$ with $p>1$.
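To make the approximation claim concrete, here is one way to read it as a display; the symbols $\sigma$, $f$, $c_k$, $a_k$, $b_k$, $N$, and $\varepsilon$ are our own notation rather than the paper's, so treat this as a hedged paraphrase and not the authors' exact theorem statement. Given any nonzero activation $\sigma \in L_2(\mathbb{R})$, any target $f \in L_2(\mathbb{R})$, and any tolerance $\varepsilon > 0$, the claim is that a one-hidden-layer network suffices:
\[
\Bigl\| \, f - \sum_{k=1}^{N} c_k \, \sigma(a_k \,\cdot\, - b_k) \Bigr\|_{L_2(\mathbb{R})} < \varepsilon
\qquad \text{for some } N \in \mathbb{N},\; c_k \in \mathbb{R},\; a_k \in \mathbb{R}\setminus\{0\},\; b_k \in \mathbb{R}.
\]
The link to the subspace result is that the closed linear span of the rescaled and shifted copies $\{\sigma(a\,\cdot - b) : a \neq 0,\ b \in \mathbb{R}\}$ is a closed subspace invariant under invertible affine transformations and contains a nonzero function, so by the main theorem it must be all of $L_2(\mathbb{R})$.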
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · math.FA
Tables of the existence of equiangular tight frames (Ghosted)
Approximation spaces of deep neural networks (Ghosted)
Sampling Theorems for Shift-invariant Spaces, Gabor Frames, and Totally Positive Functions (Ghosted)
Eldan's Stochastic Localization and the KLS Conjecture: Isoperimetry, Concentration and Mixing (Ghosted)
Equivalence of approximation by convolutional neural networks and fully-connected networks (Ghosted)
Died the same way · Ghosted
Language Models are Few-Shot Learners (Ghosted)
PyTorch: An Imperative Style, High-Performance Deep Learning Library (Ghosted)
XGBoost: A Scalable Tree Boosting System (Ghosted)