Communication-Efficient Distributed Estimator for Generalized Linear Models with a Diverging Number of Covariates

January 17, 2020 · Declared Dead · 🏛 Computational Statistics & Data Analysis

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Ping Zhou, Zhen Yu, Jingyi Ma, Maozai Tian, Ye Fan
arXiv ID: 2001.06194
Category: stat.ME
Cross-listed: cs.DC, cs.LG, stat.ML
Citations: 7
Venue: Computational Statistics & Data Analysis
Last checked: 2 months ago
Abstract
Distributed statistical inference has recently attracted considerable attention. The asymptotic efficiency of the maximum likelihood estimator (MLE), the one-step MLE, and the aggregated estimating equation estimator is established for generalized linear models under the "large $n$, diverging $p_n$" framework, where the dimension of the covariates $p_n$ grows to infinity at a polynomial rate $o(n^{\alpha})$ for some $0<\alpha<1$. A novel method is then proposed to obtain an asymptotically efficient estimator for large-scale distributed data using two rounds of communication. In this method, the assumption on the number of servers is relaxed, making it practical for real-world applications. Simulations and a case study demonstrate the satisfactory finite-sample performance of the proposed estimators.
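Since no code accompanies the paper, the following is only an illustrative sketch of the general idea behind a two-round, one-step distributed estimator for a GLM, not the authors' implementation (their contribution concerns the diverging-$p_n$ theory and the relaxed condition on the number of servers). The sketch uses logistic regression as the canonical GLM and simulates the "machines" as in-memory chunks; all function names and the pilot-estimation scheme are assumptions for the demo.

```python
import numpy as np

def local_score_and_hessian(X, y, beta):
    """Per-machine quantities for logistic regression: the score
    (gradient of the log-likelihood) and the Hessian."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    score = X.T @ (y - p)
    W = p * (1.0 - p)
    hessian = -(X * W[:, None]).T @ X  # negative observed information
    return score, hessian

def one_step_distributed_mle(chunks, beta0):
    """Round 1: broadcast a pilot estimate beta0 (e.g. the MLE computed
    on a single machine). Round 2: each machine sends back its local
    score and Hessian at beta0; the center aggregates them and takes a
    single Newton step."""
    pairs = [local_score_and_hessian(X, y, beta0) for X, y in chunks]
    S = sum(s for s, _ in pairs)
    H = sum(h for _, h in pairs)
    return beta0 - np.linalg.solve(H, S)

# Toy demo: simulate logistic data and split it across 4 "machines".
rng = np.random.default_rng(0)
n, p = 4000, 5
beta_true = np.linspace(0.5, -0.5, p)
X = rng.normal(size=(n, p))
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)
chunks = [(X[i::4], y[i::4]) for i in range(4)]

# Pilot estimate: Newton iterations on the first machine's data only.
beta0 = np.zeros(p)
X0, y0 = chunks[0]
for _ in range(20):
    s, h = local_score_and_hessian(X0, y0, beta0)
    beta0 = beta0 - np.linalg.solve(h, s)

beta_hat = one_step_distributed_mle(chunks, beta0)
print(beta_hat.shape)
```

The point of the one-step construction is that the second round refines the pilot estimate to full asymptotic efficiency while each machine communicates only a $p$-vector and a $p \times p$ matrix, rather than raw data.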
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — stat.ME

Died the same way — 👻 Ghosted