Ginzburg--Landau Functionals in the Large-Graph Limit
August 01, 2024 · Declared Dead · arXiv.org
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Edith Zhang, James Scott, Qiang Du, Mason A. Porter
arXiv ID
2408.00422
Category
math.FA
Cross-listed
cs.SI, math.CO, math.PR
Citations
0
Venue
arXiv.org
Last Checked
2 months ago
Abstract
Ginzburg--Landau (GL) functionals on graphs, which are relaxations of graph-cut functionals, have yielded a variety of insights in image segmentation and graph clustering. In this paper, we study large-graph limits of GL functionals by taking a functional-analytic view of graphs as nonlocal kernels. For a graph $W_n$ with $n$ nodes, the corresponding graph GL functional $\mathrm{GL}^{W_n}_\varepsilon$ is an energy for functions on $W_n$. We minimize GL functionals on sequences of growing graphs that converge to functions called graphons. For such sequences of graphs, we show that the graph GL functional $\Gamma$-converges to a continuous and nonlocal functional that we call the \emph{graphon GL functional}. We also investigate the sharp-interface limits of the graph GL and graphon GL functionals, and we relate these limits to a nonlocal total-variation (TV) functional. We express the limiting GL functional in terms of Young measures and thereby obtain a probabilistic interpretation of the variational problem in the large-graph limit. Finally, to develop intuition about the graphon GL functional, we determine the GL minimizer for several example families of graphons.
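For readers unfamiliar with the object being minimized: a graph GL functional combines a graph Dirichlet (smoothness) term with a double-well potential that pushes node values toward two phases. The sketch below uses the common convention $\mathrm{GL}_\varepsilon(u) = \tfrac{\varepsilon}{2}\sum_{ij} W_{ij}(u_i - u_j)^2 + \tfrac{1}{\varepsilon}\sum_i \tfrac{(u_i^2-1)^2}{4}$; normalizations vary across the literature, so this is an illustrative convention, not necessarily the exact functional defined in the paper.

```python
import numpy as np

def double_well(u):
    # Double-well potential W(u) = (u^2 - 1)^2 / 4,
    # minimized at u = +/-1 (the two "phases").
    return (u**2 - 1.0)**2 / 4.0

def graph_gl_energy(W, u, eps):
    # Graph Ginzburg--Landau energy in one common convention:
    #   GL_eps(u) = (eps/2) * sum_ij W_ij (u_i - u_j)^2
    #             + (1/eps) * sum_i (u_i^2 - 1)^2 / 4
    diff = u[:, None] - u[None, :]          # pairwise differences u_i - u_j
    dirichlet = 0.5 * eps * np.sum(W * diff**2)
    potential = np.sum(double_well(u)) / eps
    return dirichlet + potential

# Toy two-community graph: two dense blocks joined by one weak bridge edge.
n = 6
W = np.zeros((n, n))
W[:3, :3] = 1.0
W[3:, 3:] = 1.0
np.fill_diagonal(W, 0.0)
W[0, 3] = W[3, 0] = 0.1  # weak bridge between the blocks

u_cut = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])    # labels respect the cut
u_mixed = np.array([1.0, -1.0, 1.0, -1.0, 1.0, -1.0])  # labels ignore the cut

# The cut-respecting labeling has much lower GL energy, which is why
# minimizing GL functionals recovers graph clusterings.
print(graph_gl_energy(W, u_cut, eps=0.5), graph_gl_energy(W, u_mixed, eps=0.5))
```

Because both candidate labelings sit exactly at the wells $u_i = \pm 1$, the potential term vanishes and the energies compare purely through the weighted cut, mirroring the graph-cut relaxation the abstract describes.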
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – math.FA
Tables of the existence of equiangular tight frames
Approximation spaces of deep neural networks
Sampling Theorems for Shift-invariant Spaces, Gabor Frames, and Totally Positive Functions
Eldan's Stochastic Localization and the KLS Conjecture: Isoperimetry, Concentration and Mixing
Equivalence of approximation by convolutional neural networks and fully-connected networks
Died the same way – Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System