Assignment Flows for Data Labeling on Graphs: Convergence and Stability
February 26, 2020 · Declared Dead · Information Geometry
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Artjom Zern, Alexander Zeilmann, Christoph SchnΓΆrr
arXiv ID
2002.11571
Category
math.DS
Cross-listed
cs.NE, math.OC
Citations
17
Venue
Information Geometry
Last Checked
2 months ago
Abstract
The assignment flow, recently introduced in J. Math. Imaging and Vision 58(2) (2017), constitutes a high-dimensional dynamical system that evolves on an elementary statistical manifold and performs contextual labeling (classification) of data given in any metric space. Vertices of a given graph index the data points and define a system of neighborhoods. These neighborhoods, together with nonnegative weight parameters, define the regularization of the evolution of label assignments to data points through geometric averaging induced by the affine e-connection of information geometry. From the viewpoint of evolutionary game dynamics, the assignment flow may be characterized as a large system of replicator equations that are coupled by geometric averaging. This paper establishes conditions on the weight parameters that guarantee convergence of the continuous-time assignment flow to integral assignments (labelings), up to a negligible subset of situations that will not be encountered when working with real data in practice. Furthermore, we classify attractors of the flow and quantify the corresponding basins of attraction. This provides convergence guarantees for the assignment flow, which are extended to the discrete-time assignment flow that results from applying a Runge-Kutta-Munthe-Kaas scheme for numerical geometric integration of the assignment flow. Several counterexamples illustrate that violating the conditions may entail unfavorable behavior of the assignment flow regarding contextual data classification.
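To make the mechanism in the abstract concrete, here is a minimal NumPy sketch of one discrete step of an assignment flow: a likelihood lift of label distances onto the simplex, weighted geometric averaging over graph neighborhoods (the coupling between replicator equations), and a multiplicative replicator-style update. This is not the authors' implementation; the function name, the weight arrays `omega`, the scaling parameter `rho`, and the use of a plain geometric explicit Euler step (the simplest member of the Runge-Kutta-Munthe-Kaas family mentioned in the abstract) are all illustrative simplifications.

```python
import numpy as np

def assignment_flow_step(W, D, neighbors, omega, h=0.1, rho=1.0):
    """One geometric explicit-Euler step of a simplified assignment flow.

    W:         (n, c) assignment matrix; each row lies in the probability simplex
    D:         (n, c) distances between the n data points and the c labels
    neighbors: list of index arrays; neighbors[i] are the graph neighbors of vertex i
    omega:     matching list of nonnegative weights, each summing to 1
    h:         step size, rho: distance scaling (both hypothetical defaults)
    """
    # Likelihood map: lift the label distances onto the simplex relative to W.
    L = W * np.exp(-D / rho)
    L = L / L.sum(axis=1, keepdims=True)

    # Similarity map: weighted geometric averaging over each neighborhood,
    # computed in log coordinates and renormalized for stability.
    S = np.empty_like(W)
    for i, (nb, w) in enumerate(zip(neighbors, omega)):
        g = w @ np.log(L[nb])          # weighted log-average over neighbors
        g = np.exp(g - g.max())
        S[i] = g / g.sum()

    # Replicator-style multiplicative update; normalization keeps rows on the simplex.
    W_new = W * np.exp(h * S)
    return W_new / W_new.sum(axis=1, keepdims=True)
```

On a toy two-vertex graph with well-separated distances and neighborhood weights that favor each vertex's own likelihood, iterating this step drives the rows of `W` toward integral (one-hot) assignments, which is the convergence behavior the paper analyzes.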
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: math.DS
👻 Ghosted: Linearly-Recurrent Autoencoder Networks for Learning Dynamics
👻 Ghosted: Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
👻 Ghosted: Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces
👻 Ghosted: From rate distortion theory to metric mean dimension: variational principle
👻 Ghosted: Double variational principle for mean dimension
Died the same way: 👻 Ghosted
👻 Ghosted: Language Models are Few-Shot Learners
👻 Ghosted: PyTorch: An Imperative Style, High-Performance Deep Learning Library
👻 Ghosted: XGBoost: A Scalable Tree Boosting System
👻 Ghosted