The geometry of efficient codes: how rate-distortion trade-offs distort the latent representations of generative models
June 11, 2024 · Declared Dead · PLoS Comput. Biol.
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Leo D'Amato, Gian Luca Lancia, Giovanni Pezzulo
arXiv ID
2406.07269
Category
q-bio.NC
Cross-listed
cs.IT
Citations
3
Venue
PLoS Comput. Biol.
Last Checked
2 months ago
Abstract
Living organisms rely on internal models of the world to act adaptively. Because of resource limitations, these models cannot encode every detail and hence need to compress information. From a cognitive standpoint, information compression can manifest as a distortion of latent representations, resulting in the emergence of representations that may not accurately reflect the external world or its geometry. Rate-distortion theory formalizes the optimal way to compress information while minimizing such distortions, by considering factors such as capacity limitations and the frequency and utility of stimuli. However, while this theory explains why the above factors distort latent representations, it does not specify which specific distortions they produce. To address this question, here we investigate how rate-distortion trade-offs shape the latent representations of images in generative models, specifically Beta Variational Autoencoders ($\beta$-VAEs), under varying constraints of model capacity, data distributions, and task objectives. By systematically exploring these factors, we identify three primary distortions in latent representations: prototypization, specialization, and orthogonalization. These distortions emerge as signatures of information compression, reflecting the model's adaptation to capacity limitations, data imbalances, and task demands. Additionally, our findings demonstrate that these distortions can coexist, giving rise to a rich landscape of latent spaces, whose geometry could differ significantly across generative models subject to different constraints. Our findings help explain how the normative constraints of rate-distortion theory shape the geometry of latent representations in generative models, both in artificial systems and in living organisms.
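The paper lists no code, but the $\beta$-VAE objective the abstract refers to is standard: a reconstruction (distortion) term plus a KL (rate) term weighted by $\beta$, where larger $\beta$ tightens the capacity constraint on the latent code. A minimal NumPy sketch of that objective, purely illustrative and not the authors' implementation (function names and the example values are ours):

```python
import numpy as np

def kl_gaussian(mu, logvar):
    """KL divergence between N(mu, exp(logvar)) and the standard normal,
    summed over latent dimensions and averaged over the batch (the 'rate')."""
    return float(np.mean(np.sum(0.5 * (np.exp(logvar) + mu**2 - 1.0 - logvar),
                                axis=1)))

def beta_vae_loss(x, x_recon, mu, logvar, beta=4.0):
    """Distortion (squared reconstruction error) plus beta-weighted rate.
    beta=1 recovers the plain VAE; beta>1 enforces stronger compression."""
    distortion = float(np.mean(np.sum((x - x_recon) ** 2, axis=1)))
    return distortion + beta * kl_gaussian(mu, logvar)

# With mu=0 and logvar=0 the posterior matches the prior, so the rate term
# vanishes and the loss reduces to pure distortion.
x = np.ones((2, 3))
loss = beta_vae_loss(x, np.zeros_like(x), np.zeros((2, 2)), np.zeros((2, 2)))
# loss == 3.0: per-sample squared error of 3, zero KL
```

Sweeping `beta` in this loss is the knob the paper varies to trace out the rate-distortion trade-off in latent space.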
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: q-bio.NC
R.I.P. 👻 Ghosted · SuperSpike: Supervised learning in multi-layer spiking neural networks
R.I.P. 👻 Ghosted · Generic decoding of seen and imagined objects using hierarchical visual features
R.I.P. 👻 Ghosted · Convolutional Neural Networks as a Model of the Visual System: Past, Present, and Future
R.I.P. 👻 Ghosted · A probabilistic atlas of the human thalamic nuclei combining ex vivo MRI and histology
R.I.P. 👻 Ghosted · Neural network models and deep learning - a primer for biologists
Died the same way: 👻 Ghosted
R.I.P. 👻 Ghosted · Language Models are Few-Shot Learners
R.I.P. 👻 Ghosted · PyTorch: An Imperative Style, High-Performance Deep Learning Library
R.I.P. 👻 Ghosted · XGBoost: A Scalable Tree Boosting System