High-dimensional Location Estimation via Norm Concentration for Subgamma Vectors

February 05, 2023 · Declared Dead · 🏛 International Conference on Machine Learning

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Shivam Gupta, Jasper C. H. Lee, Eric Price
arXiv ID: 2302.02497
Category: math.ST
Cross-listed: cs.IT, cs.LG, math.PR, stat.ML
Citations: 8
Venue: International Conference on Machine Learning
Last Checked: 2 months ago
Abstract
In location estimation, we are given $n$ samples from a known distribution $f$ shifted by an unknown translation $\lambda$, and want to estimate $\lambda$ as precisely as possible. Asymptotically, the maximum likelihood estimate achieves the Cramér-Rao bound of error $\mathcal N(0, \frac{1}{n\mathcal I})$, where $\mathcal I$ is the Fisher information of $f$. However, the $n$ required for convergence depends on $f$, and may be arbitrarily large. We build on the theory using \emph{smoothed} estimators to bound the error for finite $n$ in terms of $\mathcal I_r$, the Fisher information of the $r$-smoothed distribution. As $n \to \infty$, $r \to 0$ at an explicit rate and this converges to the Cramér-Rao bound. We (1) improve the prior work for 1-dimensional $f$ to converge for constant failure probability in addition to high probability, and (2) extend the theory to high-dimensional distributions. In the process, we prove a new bound on the norm of a high-dimensional random variable whose 1-dimensional projections are subgamma, which may be of independent interest.
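The smoothed-estimator idea in the abstract (convolve $f$ with a Gaussian of radius $r$, then run maximum likelihood on the smoothed density) can be illustrated with a small simulation. Below is a minimal 1-dimensional sketch, assuming a standard Laplace $f$ and a simple grid-search MLE; the smoothing radius `r = n**(-0.25)` is purely illustrative, not the explicit rate derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Known base distribution f: standard Laplace, shifted by an unknown lambda.
lambda_true = 2.0
n = 2000
samples = rng.laplace(loc=lambda_true, scale=1.0, size=n)

def laplace_pdf(x):
    return 0.5 * np.exp(-np.abs(x))

def smoothed_pdf(x, r, n_quad=201):
    """Density of f convolved with N(0, r^2), via a discretized Gaussian average."""
    z = np.linspace(-5 * r, 5 * r, n_quad)
    w = np.exp(-z ** 2 / (2 * r ** 2))
    w /= w.sum()
    return (laplace_pdf(x[:, None] - z[None, :]) * w[None, :]).sum(axis=1)

def smoothed_mle(samples, r, grid):
    """Maximize the r-smoothed log-likelihood over candidate shifts."""
    ll = [np.log(smoothed_pdf(samples - lam, r) + 1e-300).sum() for lam in grid]
    return grid[int(np.argmax(ll))]

# Center the search grid at a crude consistent estimate (the sample median).
med = np.median(samples)
grid = np.linspace(med - 1.0, med + 1.0, 401)
r = n ** (-0.25)  # illustrative shrinking radius only; the paper derives the rate
est = smoothed_mle(samples, r, grid)
print(f"true shift {lambda_true:.3f}, smoothed-MLE estimate {est:.3f}")
```

Since the Laplace density has a non-smooth cusp at its mode, the plain MLE's finite-sample behavior is exactly the kind of case the smoothing is meant to tame; as `r` shrinks with `n`, the smoothed Fisher information $\mathcal I_r$ approaches $\mathcal I$.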
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — math.ST

Died the same way — 👻 Ghosted