Randomized Kaczmarz Algorithm for Inconsistent Linear Systems: An Exact MSE Analysis
February 01, 2015 · Declared Dead · International Conference on Sampling Theory and Applications
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Chuang Wang, Ameya Agaskar, Yue M. Lu
arXiv ID
1502.00190
Category
math.NA: Numerical Analysis
Cross-listed
cs.IT, math.OC
Citations
12
Venue
International Conference on Sampling Theory and Applications
Last Checked
2 months ago
Abstract
We provide a complete characterization of the randomized Kaczmarz algorithm (RKA) for inconsistent linear systems. The Kaczmarz algorithm, known in some fields as the algebraic reconstruction technique, is a classical method for solving large-scale overdetermined linear systems through a sequence of projection operators; the randomized Kaczmarz algorithm is a recent proposal by Strohmer and Vershynin to randomize the sequence of projections in order to guarantee exponential convergence (in mean square) to the solutions. A flurry of work followed this development, with renewed interest in the algorithm, its extensions, and various bounds on their performance. Earlier, we studied the special case of consistent linear systems and provided an exact formula for the mean squared error (MSE) in the value reconstructed by RKA, as well as a simple way to compute the exact decay rate of the error. In this work, we consider the case of inconsistent linear systems, which is a more relevant scenario for most applications. First, by using a "lifting trick", we derive an exact formula for the MSE given a fixed noise vector added to the measurements. Then we show how to average over the noise when it is drawn from a distribution with known first and second-order statistics. Finally, we demonstrate the accuracy of our exact MSE formulas through numerical simulations, which also illustrate that previous upper bounds in the literature may be several orders of magnitude too high.
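The update rule the abstract refers to (project the current iterate onto the hyperplane of one randomly chosen row, with rows sampled proportionally to their squared norms, per Strohmer and Vershynin) can be sketched as below. This is a minimal illustration of the algorithm being analyzed, not the paper's exact experimental setup; the problem sizes, noise level, and seed are illustrative assumptions.

```python
import numpy as np

def randomized_kaczmarz(A, b, x0, iters, rng):
    """Randomized Kaczmarz (RKA): each step projects the iterate onto the
    solution hyperplane of one row a_i, with row i drawn with probability
    proportional to ||a_i||^2 (the Strohmer-Vershynin sampling rule)."""
    row_norms2 = np.sum(A ** 2, axis=1)
    probs = row_norms2 / row_norms2.sum()
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        i = rng.choice(len(b), p=probs)
        # Orthogonal projection onto {x : a_i @ x = b_i}.
        x += (b[i] - A[i] @ x) / row_norms2[i] * A[i]
    return x

# Overdetermined system made inconsistent by additive measurement noise
# (the setting this paper analyzes); sizes and noise level are arbitrary.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))
x_true = rng.standard_normal(20)
b = A @ x_true + 0.01 * rng.standard_normal(200)  # noise -> no exact solution
x_hat = randomized_kaczmarz(A, b, np.zeros(20), iters=2000, rng=rng)
err = np.linalg.norm(x_hat - x_true)
```

With noisy measurements the iterates do not converge to `x_true` exactly; the error decays exponentially in expectation until it reaches a floor set by the noise, which is the regime whose MSE the paper characterizes exactly.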
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Numerical Analysis
R.I.P. · 👻 Ghosted · Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations
R.I.P. · 👻 Ghosted · PDE-Net: Learning PDEs from Data
R.I.P. · 👻 Ghosted · Efficient tensor completion for color image and video recovery: Low-rank tensor train
R.I.P. · 👻 Ghosted · Tensor Ring Decomposition
R.I.P. · 👻 Ghosted · Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations
Died the same way · 👻 Ghosted
R.I.P. · 👻 Ghosted · Language Models are Few-Shot Learners
R.I.P. · 👻 Ghosted · PyTorch: An Imperative Style, High-Performance Deep Learning Library
R.I.P. · 👻 Ghosted · XGBoost: A Scalable Tree Boosting System