Effective dimensional reduction of complex systems based on tensor networks
November 20, 2024 · Declared Dead · Journal of Physics: Complexity
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Wout Merbis, Madelon Geurts, ClΓ©lia de Mulatier, Philippe Corboz
arXiv ID
2411.13364
Category
cond-mat.stat-mech
Cross-listed
cs.SI, physics.soc-ph
Citations
3
Venue
Journal of Physics: Complexity
Last Checked
2 months ago
Abstract
The exact treatment of Markovian models of complex systems requires knowledge of probability distributions exponentially large in the number of components $n$. Mean-field approximations provide an effective reduction in the complexity of the models, requiring only a number of phase-space variables polynomial in system size. However, this comes at the cost of losing accuracy close to critical points in the system's dynamics and an inability to capture correlations in the system. In this work, we introduce a tunable approximation scheme for Markovian spreading models on networks based on Matrix Product States (MPS). By controlling the bond dimensions of the MPS, we can investigate the effective dimensionality needed to accurately represent the exact $2^n$-dimensional steady-state distribution. We introduce the entanglement entropy as a measure of the compressibility of the system and find that it peaks just after the phase transition on the disordered side, in line with the intuition that more complex states are at the 'edge of chaos'. We compare the accuracy of the MPS with exact methods on different types of small random networks and with Markov Chain Monte Carlo methods for a simplified version of the railway network of the Netherlands with 55 nodes. The MPS provides a systematic way to tune the accuracy of the approximation by reducing the dimensionality of the system's state vector, leading to an improvement over second-order mean-field approximations for sufficiently large bond dimensions.
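The two ingredients the abstract describes — compressing a $2^n$-dimensional state vector into an MPS with a capped bond dimension, and reading off an entanglement entropy from the Schmidt spectrum of a bipartition — can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: the function names are made up here, and it uses plain dense SVDs on a generic state vector rather than the paper's spreading-model machinery.

```python
import numpy as np

def state_to_mps(psi, n, chi_max):
    """Compress a 2**n-dimensional vector into a Matrix Product State
    by sequential SVDs, keeping at most chi_max singular values per bond."""
    tensors = []
    rest = psi.reshape(1, -1)          # (left bond, remaining physical dims)
    for _ in range(n - 1):
        chi_left = rest.shape[0]
        m = rest.reshape(chi_left * 2, -1)       # split off one site
        u, s, vh = np.linalg.svd(m, full_matrices=False)
        chi = min(chi_max, len(s))               # truncate the bond
        u, s, vh = u[:, :chi], s[:chi], vh[:chi]
        tensors.append(u.reshape(chi_left, 2, chi))
        rest = s[:, None] * vh                   # carry weights to the right
    tensors.append(rest.reshape(rest.shape[0], 2, 1))
    return tensors

def mps_to_state(tensors):
    """Contract the MPS back into a dense 2**n-dimensional vector."""
    out = tensors[0]
    for t in tensors[1:]:
        out = np.tensordot(out, t, axes=([-1], [0]))
    return out.reshape(-1)

def entanglement_entropy(psi, cut):
    """Von Neumann entropy of the bipartition after `cut` sites,
    computed from the normalized Schmidt (singular-value) spectrum."""
    s = np.linalg.svd(psi.reshape(2**cut, -1), compute_uv=False)
    p = s**2 / np.sum(s**2)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))
```

With `chi_max` at its maximal useful value ($2^{n/2}$ for a length-$n$ chain) the compression is exact; lowering it trades accuracy for a polynomially sized representation, which is the tuning knob the abstract refers to. A product state (no correlations) has zero entropy and is exactly captured already at bond dimension 1.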
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers

In the same crypt: cond-mat.stat-mech
- Unsupervised learning of phase transitions: from principal component analysis to variational autoencoders (R.I.P.: Old Age)
- Unsupervised Generative Modeling Using Matrix Product States (R.I.P.: Ghosted)
- Solving Statistical Mechanics Using Variational Autoregressive Networks (R.I.P.: Ghosted)
- Learning Thermodynamics with Boltzmann Machines (R.I.P.: Ghosted)
- Information Flows? A Critique of Transfer Entropies

Died the same way: Ghosted
- Language Models are Few-Shot Learners (R.I.P.: Ghosted)
- PyTorch: An Imperative Style, High-Performance Deep Learning Library (R.I.P.: Ghosted)
- XGBoost: A Scalable Tree Boosting System (R.I.P.: Ghosted)