Analog Physical Systems Can Exhibit Double Descent

November 21, 2025 · Declared Dead · 🏛 arXiv.org

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Sam Dillavou, Jason W Rocks, Jacob F Wycoff, Andrea J Liu, Douglas J Durian
arXiv ID: 2511.17825
Category: cond-mat.dis-nn
Cross-listed: cs.LG
Citations: 0
Venue: arXiv.org
Last checked: 2 months ago
Abstract
An important component of the success of large AI models is double descent, in which networks avoid overfitting as they grow relative to the amount of training data, instead improving their performance on unseen data. Here we demonstrate double descent in a decentralized analog network of self-adjusting resistive elements. This system trains itself and performs tasks without a digital processor, offering potential gains in energy efficiency and speed -- but must endure component non-idealities. We find that standard training fails to yield double descent, but a modified protocol that accommodates this inherent imperfection succeeds. Our findings show that analog physical systems, if appropriately trained, can exhibit behaviors underlying the success of digital AI. Further, they suggest that biological systems might similarly benefit from over-parameterization.
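The analog implementation itself has no public code, but the double-descent phenomenon the abstract describes can be sketched digitally. The toy model below (my illustration, not the authors' setup) fits minimum-norm least squares on random ReLU features: test error rises as the number of features approaches the number of training samples (the interpolation threshold), then falls again as the model grows further. All sizes and noise levels here are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy teacher: noisy linear target in d = 20 dimensions.
d, n_train, n_test = 20, 40, 500
w_true = rng.standard_normal(d)
X_tr = rng.standard_normal((n_train, d))
y_tr = X_tr @ w_true + 0.5 * rng.standard_normal(n_train)
X_te = rng.standard_normal((n_test, d))
y_te = X_te @ w_true

def random_feature_error(p, seed):
    """Fit min-norm least squares on p random ReLU features; return test MSE."""
    r = np.random.default_rng(seed)
    W = r.standard_normal((d, p)) / np.sqrt(d)      # random first layer (frozen)
    F_tr = np.maximum(X_tr @ W, 0.0)                # ReLU features, train
    F_te = np.maximum(X_te @ W, 0.0)                # ReLU features, test
    beta, *_ = np.linalg.lstsq(F_tr, y_tr, rcond=None)  # minimum-norm solution
    return float(np.mean((F_te @ beta - y_te) ** 2))

# Interpolation threshold sits near p = n_train = 40; average over seeds.
widths = [5, 10, 20, 40, 80, 160, 320]
errors = {p: np.mean([random_feature_error(p, s) for s in range(10)])
          for p in widths}
```

Plotting `errors` against `widths` gives the characteristic double-descent curve: the error peak at `p = 40` exceeds the error of the much wider `p = 320` model, which is the behavior the paper reports recovering in its analog resistive network under a modified training protocol.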
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — cond-mat.dis-nn

Died the same way — 👻 Ghosted