The sample complexity of weighted sparse approximation

July 24, 2015 · Declared Dead · 🏛 IEEE Transactions on Signal Processing

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Bubacarr Bah, Rachel Ward
arXiv ID: 1507.06736
Category: math.NA (Numerical Analysis)
Cross-listed: cs.CC, cs.IT, math.FA, stat.CO
Citations: 17
Venue: IEEE Transactions on Signal Processing
Last Checked: 2 months ago
Abstract
For Gaussian sampling matrices, we provide bounds on the minimal number of measurements $m$ required to achieve robust weighted sparse recovery guarantees in terms of how well a given prior model for the sparsity support aligns with the true underlying support. Our main contribution is that for a sparse vector ${\bf x} \in \mathbb{R}^N$ supported on an unknown set $\mathcal{S} \subset \{1, \dots, N\}$ with $|\mathcal{S}|\leq k$, if $\mathcal{S}$ has \emph{weighted cardinality} $\omega(\mathcal{S}) := \sum_{j \in \mathcal{S}} \omega_j^2$, and if the weights on $\mathcal{S}^c$ exhibit mild growth, $\omega_j^2 \geq \gamma \log(j/\omega(\mathcal{S}))$ for $j\in\mathcal{S}^c$ and $\gamma > 0$, then the sample complexity for sparse recovery via weighted $\ell_1$-minimization using weights $\omega_j$ is linear in the weighted sparsity level, and $m = \mathcal{O}(\omega(\mathcal{S})/\gamma)$. This main result is a generalization of special cases including a) the standard sparse recovery setting, where all weights $\omega_j \equiv 1$ and $m = \mathcal{O}\left(k\log\left(N/k\right)\right)$; b) the setting where the support is known a priori, and $m = \mathcal{O}(k)$; and c) the setting of sparse recovery with prior information, where $m$ depends on how well the weights are aligned with the support set $\mathcal{S}$. We further extend the results in case c) to the setting of additive noise. Our results are {\em nonuniform}; that is, they apply for a fixed support, unknown a priori, and the weights on $\mathcal{S}$ do not all have to be smaller than the weights on $\mathcal{S}^c$ for our recovery results to hold.
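The weighted $\ell_1$-minimization problem the abstract refers to, $\min_x \sum_j \omega_j |x_j|$ subject to $Ax = b$, can be posed as a linear program. Below is a minimal sketch (not the authors' code, which is the point of this listing): it uses the standard LP reformulation with auxiliary variables $t_j \geq |x_j|$ and `scipy.optimize.linprog`, and the problem sizes, seed, and weight values are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1_recover(A, b, w):
    """Solve min_x sum_j w_j |x_j| subject to A x = b as a linear program.

    Stack z = [x; t] and enforce t_j >= |x_j| via x - t <= 0 and -x - t <= 0,
    then minimize the weighted sum of the t_j.
    """
    m, N = A.shape
    c = np.concatenate([np.zeros(N), w])          # objective acts only on t
    I = np.eye(N)
    A_ub = np.block([[I, -I], [-I, -I]])          # encodes |x_j| <= t_j
    b_ub = np.zeros(2 * N)
    A_eq = np.hstack([A, np.zeros((m, N))])       # measurement constraint A x = b
    bounds = [(None, None)] * N + [(0, None)] * N # x free, t nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=bounds, method="highs")
    return res.x[:N]

# Demo: Gaussian measurements, a 3-sparse signal, and a weight vector that
# down-weights the entries where a (correct) prior expects the support.
rng = np.random.default_rng(0)
N, m, k = 50, 20, 3
A = rng.standard_normal((m, N)) / np.sqrt(m)
x_true = np.zeros(N)
x_true[:k] = [3.0, -2.0, 1.5]
b = A @ x_true
w = np.ones(N)
w[:k] = 0.5   # smaller weights on the presumed support, as in case c)
x_hat = weighted_l1_recover(A, b, w)
```

With weights aligned with the true support, recovery succeeds at measurement counts where the problem would otherwise be near the edge of the standard $\ell_1$ regime, which is the regime the paper's bounds quantify.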
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Numerical Analysis

R.I.P. 👻 Ghosted

Tensor Ring Decomposition

Qibin Zhao, Guoxu Zhou, ... (+3 more)

math.NA ๐Ÿ› arXiv ๐Ÿ“š 427 cites 9 years ago

Died the same way — 👻 Ghosted