Generalized Quadratic Embeddings for Nonlinear Dynamics using Deep Learning
November 01, 2022 · Declared Dead · Physica A: Statistical Mechanics and its Applications
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Pawan Goyal, Peter Benner
arXiv ID
2211.00357
Category
math.DS
Cross-listed
cs.LG
Citations
15
Venue
Physica A: Statistical Mechanics and its Applications
Last Checked
2 months ago
Abstract
The engineering design process often relies on mathematical modeling that can describe the underlying dynamic behavior. In this work, we present a data-driven methodology for modeling the dynamics of nonlinear systems. To simplify this task, we aim to identify a coordinate transformation that allows us to represent the dynamics of nonlinear systems using a common, simple model structure. The advantage of a common simple model is that customized design tools developed for it can be applied to study a large variety of nonlinear systems. The simplest common model one can think of is linear, but linear systems often fall short in accurately capturing the complex dynamics of nonlinear systems. In this work, we propose using quadratic systems as the common structure, inspired by the lifting principle. According to this principle, smooth nonlinear systems can be expressed as quadratic systems in suitable coordinates without approximation errors. However, finding these coordinates solely from data is challenging. Here, we leverage deep learning to identify such lifted coordinates using only data, enabling a quadratic dynamical system to describe the system's dynamics. Additionally, we discuss the asymptotic stability of these quadratic dynamical systems. We illustrate the approach using data collected from various numerical examples, demonstrating its superior performance over existing well-known techniques.
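The lifting principle the abstract invokes can be illustrated with a small hand-worked example (not taken from the paper itself): the cubic ODE dx/dt = x³ becomes exactly quadratic after introducing the auxiliary coordinate w = x², since then dx/dt = x·w and dw/dt = 2x·(dx/dt) = 2w². The sketch below, a minimal NumPy illustration under these assumptions, integrates both the original and the lifted quadratic system and compares them against the closed-form solution:

```python
import numpy as np

def f_original(x):
    # Original nonlinear ODE: dx/dt = x**3 (cubic, not quadratic).
    return x**3

def f_lifted(z):
    # Lifted state z = (x, w) with w = x**2.
    # In these coordinates the dynamics are purely quadratic:
    #   dx/dt = x * w,   dw/dt = 2 * w**2
    x, w = z
    return np.array([x * w, 2.0 * w**2])

def rk4(f, y0, t_end, n_steps):
    # Classic fourth-order Runge-Kutta integrator.
    h = t_end / n_steps
    y = np.asarray(y0, dtype=float)
    for _ in range(n_steps):
        k1 = f(y)
        k2 = f(y + 0.5 * h * k1)
        k3 = f(y + 0.5 * h * k2)
        k4 = f(y + h * k3)
        y = y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

x0 = 0.5
t_end = 1.0
x_orig = rk4(f_original, np.array([x0]), t_end, 1000)[0]
x_lift = rk4(f_lifted, np.array([x0, x0**2]), t_end, 1000)[0]
# Closed-form solution of dx/dt = x**3: x(t) = x0 / sqrt(1 - 2*x0**2*t)
x_exact = x0 / np.sqrt(1.0 - 2.0 * x0**2 * t_end)  # ≈ 0.7071
```

The lifted quadratic trajectory reproduces the original cubic dynamics to integrator precision, which is the "without approximation errors" property the abstract refers to. The paper's contribution is learning such lifted coordinates from data with an autoencoder rather than deriving them by hand as done here.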
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · math.DS
R.I.P. 👻 Ghosted · Linearly-Recurrent Autoencoder Networks for Learning Dynamics
R.I.P. 👻 Ghosted · Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
R.I.P. 👻 Ghosted · Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces
R.I.P. 👻 Ghosted · From rate distortion theory to metric mean dimension: variational principle
R.I.P. 👻 Ghosted · Double variational principle for mean dimension
Died the same way · 👻 Ghosted
R.I.P. 👻 Ghosted · Language Models are Few-Shot Learners
R.I.P. 👻 Ghosted · PyTorch: An Imperative Style, High-Performance Deep Learning Library
R.I.P. 👻 Ghosted · XGBoost: A Scalable Tree Boosting System