Randomly Aggregated Least Squares for Support Recovery

March 16, 2020 · Declared Dead · 🏛 Signal Processing

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Ofir Lindenbaum, Stefan Steinerberger
arXiv ID: 2003.07331
Category: math.ST (cross-listed: cs.IT)
Citations: 11
Venue: Signal Processing
Last checked: 2 months ago
Abstract
We study the problem of exact support recovery: given an (unknown) vector $θ\in \left\{-1,0,1\right\}^D$, we are given access to the noisy measurement $$ y = Xθ+ ω,$$ where $X \in \mathbb{R}^{N \times D}$ is a (known) Gaussian matrix and the noise $ω\in \mathbb{R}^N$ is an (unknown) Gaussian vector. How small can we choose $N$ and still reliably recover the support of $θ$? We present RAWLS (Randomly Aggregated UnWeighted Least Squares Support Recovery): the main idea is to take random subsets of the $N$ equations, perform a least squares recovery on each reduced system, and then average over many random subsets. We show that the proposed procedure can provably recover an approximation of $θ$ and demonstrate its use in support recovery through numerical examples.
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — math.ST

Died the same way — 👻 Ghosted