NoPeek: Information leakage reduction to share activations in distributed deep learning

August 20, 2020 · Declared Dead · 🏛 2020 International Conference on Data Mining Workshops (ICDMW)

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Praneeth Vepakomma, Abhishek Singh, Otkrist Gupta, Ramesh Raskar
arXiv ID: 2008.09161
Category: cs.LG (Machine Learning)
Cross-listed: cs.DC, stat.ML
Citations: 104
Venue: 2020 International Conference on Data Mining Workshops (ICDMW)
Last Checked: 2 months ago
Abstract
For distributed machine learning with sensitive data, we demonstrate how minimizing the distance correlation between raw data and intermediary representations reduces leakage of sensitive raw-data patterns across client communications while maintaining model accuracy. Leakage, measured using distance correlation between input and intermediate representations, is the risk associated with the invertibility of raw data from intermediary representations; this risk can prevent client entities that hold sensitive data from using distributed deep learning services. Our method reduces distance correlation between raw data and learned representations during training and inference, and we demonstrate on image datasets that it is resilient to such reconstruction attacks: it prevents reconstruction of raw data while retaining the information needed to sustain good classification accuracy.
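No code was released, but the abstract's core idea (add a penalty on the distance correlation between raw inputs and a layer's activations to the task loss) can be sketched. This is a hedged illustration, not the authors' implementation: the estimator below is the standard sample distance correlation (Székely-style double-centered distance matrices), and the loss weight `alpha` and the helper names are hypothetical.

```python
import numpy as np

def pairwise_dist(X):
    # Euclidean distance matrix between rows of X (n x n)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.sqrt(np.maximum(d2, 0.0))

def double_center(D):
    # Subtract row means, column means, add back the grand mean
    return D - D.mean(axis=0, keepdims=True) - D.mean(axis=1, keepdims=True) + D.mean()

def distance_correlation(X, Z):
    # Sample distance correlation between rows of X (raw data)
    # and rows of Z (intermediate activations); value in [0, 1]
    A = double_center(pairwise_dist(X))
    B = double_center(pairwise_dist(Z))
    dcov2 = (A * B).mean()           # squared sample distance covariance
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    if denom <= 0.0:
        return 0.0                   # degenerate (constant) input
    return np.sqrt(np.maximum(dcov2, 0.0) / denom)

def nopeek_style_loss(task_loss, X, Z, alpha=0.1):
    # Hypothetical combined objective: task loss plus a leakage penalty;
    # alpha is an assumed hyperparameter, not a value from the paper
    return task_loss + alpha * distance_correlation(X, Z)
```

In a split-learning setup, the client would minimize `nopeek_style_loss` with `Z` being the activations it sends upstream; raising `alpha` trades reconstruction resistance against task accuracy. (A practical version would use a differentiable framework such as PyTorch so the penalty backpropagates through the encoder.)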
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Machine Learning

Died the same way – 👻 Ghosted