Embedding Capabilities of Neural ODEs
August 02, 2023 · Declared Dead · arXiv.org
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Christian Kuehn, Sara-Viola Kuntz
arXiv ID
2308.01213
Category
math.DS
Cross-listed
cs.NE
Citations
5
Venue
arXiv.org
Last Checked
2 months ago
Abstract
Neural ordinary differential equations (neural ODEs) are a class of neural networks that has gained particular interest in recent years. We study input-output relations of neural ODEs using dynamical systems theory and prove several results about the exact embedding of maps in different neural ODE architectures in low and high dimension. The embedding capability of a neural ODE architecture can be increased, for example, by adding a linear layer or by augmenting the phase space. Yet there is currently no systematic theory available, and our work contributes towards this goal by developing various embedding results as well as identifying situations where no embedding is possible. The main mathematical techniques used include iterative functional equations, Morse functions, and suspension flows, together with several further ideas from analysis. Although in practice mainly universal approximation theorems are used, our geometric dynamical systems viewpoint on universal embedding provides a fundamental understanding of why certain neural ODE architectures perform better than others.
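The abstract's central objects can be sketched in a few lines: a neural ODE maps an input to the time-T flow of a learned vector field, and "augmenting the phase space" means zero-padding the input into a higher-dimensional space before flowing. The snippet below is an illustrative toy (forward Euler, a fixed tanh vector field with randomly chosen weights), not the authors' construction; the names, weights, and step counts are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def neural_ode_forward(x, W, b, T=1.0, steps=100):
    """Integrate dh/dt = tanh(W h + b) from h(0) = x up to time T
    with forward Euler; the result is the network's output."""
    h = x.astype(float)
    dt = T / steps
    for _ in range(steps):
        h = h + dt * np.tanh(W @ h + b)
    return h

d, p = 2, 3                       # data dimension, augmentation dimensions
W = rng.standard_normal((d + p, d + p))
b = rng.standard_normal(d + p)

x = np.array([1.0, -0.5])

# "Augmenting the phase space": lift x into R^{d+p} by zero-padding,
# flow in the larger space, then project back onto the first d coordinates.
x_aug = np.concatenate([x, np.zeros(p)])
y = neural_ode_forward(x_aug, W, b)[:d]
print(y.shape)  # (2,)
```

The point of such modifications is that the time-T flow map of an autonomous ODE on R^d is always a homeomorphism; in one dimension it is even monotone increasing, so a map like x ↦ −x cannot be realized by a plain scalar neural ODE, while augmentation or an extra linear layer removes this obstruction.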
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – math.DS
R.I.P. 👻 Ghosted – Linearly-Recurrent Autoencoder Networks for Learning Dynamics
R.I.P. 👻 Ghosted – Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
R.I.P. 👻 Ghosted – Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces
R.I.P. 👻 Ghosted – From rate distortion theory to metric mean dimension: variational principle
R.I.P. 👻 Ghosted – Double variational principle for mean dimension
Died the same way – 👻 Ghosted
R.I.P. 👻 Ghosted – Language Models are Few-Shot Learners
R.I.P. 👻 Ghosted – PyTorch: An Imperative Style, High-Performance Deep Learning Library
R.I.P. 👻 Ghosted – XGBoost: A Scalable Tree Boosting System