Randomly Aggregated Least Squares for Support Recovery
March 16, 2020 · Declared Dead · 🏛 Signal Processing
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors: Ofir Lindenbaum, Stefan Steinerberger
arXiv ID: 2003.07331
Category: math.ST
Cross-listed: cs.IT
Citations: 11
Venue: Signal Processing
Last Checked: 2 months ago
Abstract
We study the problem of exact support recovery: given an (unknown) vector $θ \in \left\{-1,0,1\right\}^D$, we are given access to the noisy measurement $$y = Xθ + ω,$$ where $X \in \mathbb{R}^{N \times D}$ is a (known) Gaussian matrix and the noise $ω \in \mathbb{R}^N$ is an (unknown) Gaussian vector. How small can we choose $N$ and still reliably recover the support of $θ$? We present RAWLS (Randomly Aggregated UnWeighted Least Squares Support Recovery): the main idea is to take random subsets of the $N$ equations, perform a least squares recovery over this reduced set of equations, and then average over many random subsets. We show that the proposed procedure can provably recover an approximation of $θ$ and demonstrate its use in support recovery through numerical examples.
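The subsample-then-average idea from the abstract can be sketched in a few lines of NumPy. This is a minimal illustration of the stated recipe, not the paper's implementation: the subset size `n_sub`, round count `n_rounds`, and the final thresholding rule (half the expected shrinkage factor `n_sub / D`) are illustrative assumptions.

```python
import numpy as np

def rawls(X, y, n_sub, n_rounds=300, rng=None):
    """Sketch of the RAWLS idea: average minimum-norm least squares
    solutions computed on random subsets of the N equations."""
    rng = np.random.default_rng(rng)
    N, D = X.shape
    acc = np.zeros(D)
    for _ in range(n_rounds):
        idx = rng.choice(N, size=n_sub, replace=False)
        # Least squares on the reduced system X[idx] @ theta ≈ y[idx];
        # for n_sub < D, lstsq returns the minimum-norm solution.
        theta_hat, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        acc += theta_hat
    return acc / n_rounds

# Hypothetical usage: recover the support of a {-1, 0, 1}-valued vector.
rng = np.random.default_rng(0)
D, N = 100, 400
theta = np.zeros(D)
theta[:3] = 1.0
theta[3:5] = -1.0
X = rng.standard_normal((N, D))           # known Gaussian matrix
y = X @ theta + 0.1 * rng.standard_normal(N)  # noisy measurement
theta_avg = rawls(X, y, n_sub=50, rng=1)
# On-support entries concentrate near ±(n_sub / D); threshold at half that.
support_est = set(np.where(np.abs(theta_avg) > 0.25)[0])
```

Each underdetermined solve shrinks $θ$ by roughly the factor $n_{\text{sub}}/D$ in expectation, so averaging many rounds yields a scaled approximation of $θ$ whose large entries reveal the support.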
📜 Similar Papers
In the same crypt — math.ST (all 👻 Ghosted)
- An introduction to Topological Data Analysis: fundamental and practical aspects for data scientists
- Minimax Optimal Procedures for Locally Private Estimation
- Optimal Best Arm Identification with Fixed Confidence
- Fast low-rank estimation by projected gradient descent: General statistical and algorithmic guarantees
- User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient
Died the same way — 👻 Ghosted
- Language Models are Few-Shot Learners
- PyTorch: An Imperative Style, High-Performance Deep Learning Library
- XGBoost: A Scalable Tree Boosting System