Are Neural Networks Collision Resistant?
September 24, 2025 · Declared Dead · IACR Cryptology ePrint Archive
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Marco Benedetti, Andrej Bogdanov, Enrico M. Malatesta, Marc MΓ©zard, Gianmarco Perrupato, Alon Rosen, Nikolaj I. Schwartzbach, Riccardo Zecchina
arXiv ID
2509.20262
Category
cond-mat.dis-nn
Cross-listed
cs.CR, math.PR
Citations
1
Venue
IACR Cryptology ePrint Archive
Last Checked
2 months ago
Abstract
When neural networks are trained to classify a dataset, one finds a set of weights from which the network produces a label for each data point. We study the algorithmic complexity of finding a collision in a single-layer neural net, where a collision is defined as two distinct sets of weights that assign the same labels to all data. For binary perceptrons with oscillating activation functions, we establish the emergence of an overlap gap property in the space of collisions. This is a topological property believed to be a barrier to the performance of efficient algorithms. The hardness is supported by numerical experiments with approximate message passing algorithms, which stop working well below the value predicted by our analysis. Neural networks provide a new category of candidate collision-resistant functions, which for some parameter settings depart from constructions based on lattices. Beyond relevance to cryptography, our work uncovers new forms of computational hardness emerging in large neural networks which may be of independent interest.
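The collision notion from the abstract can be made concrete with a small sketch. Here two distinct weight vectors "collide" on a dataset if they assign identical labels to every data point; the specific oscillating activation (the sign of a cosine) and all names below are illustrative assumptions, not the paper's exact construction. Note that an even activation like this makes w and -w collide trivially, which hints at why the interesting question is the hardness of finding collisions beyond such symmetries.

```python
import numpy as np

def labels(w, X):
    """Label each row of X via an illustrative oscillating activation,
    sign(cos(w . x)). The paper's exact activation may differ."""
    return np.sign(np.cos(X @ w))

def is_collision(w1, w2, X):
    """True iff w1 and w2 are distinct but label all of X identically."""
    return (not np.array_equal(w1, w2)) and \
        np.array_equal(labels(w1, X), labels(w2, X))

# Toy instance: binary (+/-1) weights, Gaussian data.
rng = np.random.default_rng(0)
n, d = 20, 50
X = rng.standard_normal((n, d))
w1 = rng.choice([-1.0, 1.0], size=d)

# Because cos is even, w and -w always produce the same labels,
# so (w1, -w1) is a (trivial) collision.
print(is_collision(w1, -w1, X))  # True
print(is_collision(w1, w1, X))   # False: identical weights are not distinct
```

Finding a *nontrivial* collision, i.e. one not related by such a symmetry, is exactly the task the paper argues is hard via the overlap gap property.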
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: cond-mat.dis-nn
- 👻 Ghosted: Mutual Information, Neural Networks and the Renormalization Group
- 👻 Ghosted: Machine learning meets network science: dimensionality reduction for fast and efficient embedding of networks in the hyperbolic space
- 👻 Ghosted: Classification and Geometry of General Perceptual Manifolds
- 👻 Ghosted: The jamming transition as a paradigm to understand the loss landscape of deep neural networks
- 👻 Ghosted: Criticality in Formal Languages and Statistical Physics
Died the same way: 👻 Ghosted
- Language Models are Few-Shot Learners
- PyTorch: An Imperative Style, High-Performance Deep Learning Library
- XGBoost: A Scalable Tree Boosting System