Analog Physical Systems Can Exhibit Double Descent
November 21, 2025 · Declared Dead · arXiv.org
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Sam Dillavou, Jason W Rocks, Jacob F Wycoff, Andrea J Liu, Douglas J Durian
arXiv ID
2511.17825
Category
cond-mat.dis-nn
Cross-listed
cs.LG
Citations
0
Venue
arXiv.org
Last Checked
2 months ago
Abstract
An important component of the success of large AI models is double descent, in which networks avoid overfitting as they grow relative to the amount of training data, instead improving their performance on unseen data. Here we demonstrate double descent in a decentralized analog network of self-adjusting resistive elements. This system trains itself and performs tasks without a digital processor, offering potential gains in energy efficiency and speed -- but must endure component non-idealities. We find that standard training fails to yield double descent, but a modified protocol that accommodates this inherent imperfection succeeds. Our findings show that analog physical systems, if appropriately trained, can exhibit behaviors underlying the success of digital AI. Further, they suggest that biological systems might similarly benefit from over-parameterization.
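The double-descent behavior the abstract refers to can be illustrated with a minimal digital sketch (this is generic illustration, not the paper's analog resistive network or its training protocol): fitting minimum-norm least squares on random ReLU features, test error typically peaks near the interpolation point (width ≈ number of training samples) and then falls again as the model grows. Function name and parameters below are hypothetical.

```python
import numpy as np

def double_descent_curve(n_train=40, n_test=500, d=10,
                         widths=(5, 20, 40, 80, 200), seed=0):
    """Test MSE of min-norm least squares on random ReLU features,
    as model width grows past the interpolation point (width == n_train)."""
    rng = np.random.default_rng(seed)
    # Ground-truth linear teacher with small label noise.
    w_true = rng.normal(size=d)
    X_tr = rng.normal(size=(n_train, d))
    X_te = rng.normal(size=(n_test, d))
    y_tr = X_tr @ w_true + 0.1 * rng.normal(size=n_train)
    y_te = X_te @ w_true

    errs = []
    for p in widths:
        W = rng.normal(size=(d, p)) / np.sqrt(d)   # frozen random first layer
        F_tr = np.maximum(X_tr @ W, 0.0)           # ReLU random features
        F_te = np.maximum(X_te @ W, 0.0)
        # Minimum-norm least-squares fit of the readout weights.
        beta, *_ = np.linalg.lstsq(F_tr, y_tr, rcond=None)
        errs.append(float(np.mean((F_te @ beta - y_te) ** 2)))
    return dict(zip(widths, errs))
```

With a typical seed, the error at `width == n_train` spikes and then decreases again for wider models; the paper's contribution is showing an analogous curve in hardware, where component non-idealities require a modified training protocol.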
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers

In the same crypt · cond-mat.dis-nn
👻 Ghosted · Mutual Information, Neural Networks and the Renormalization Group
👻 Ghosted · Machine learning meets network science: dimensionality reduction for fast and efficient embedding of networks in the hyperbolic space
👻 Ghosted · Classification and Geometry of General Perceptual Manifolds
👻 Ghosted · The jamming transition as a paradigm to understand the loss landscape of deep neural networks
👻 Ghosted · Criticality in Formal Languages and Statistical Physics
Died the same way · 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System