R.I.P.
👻
Ghosted
Decentralized learning with budgeted network load using Gaussian copulas and classifier ensembles
April 26, 2018 · Entered Twilight · PKDD/ECML Workshops
"Last commit was 7.0 years ago (β₯5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: README.md, delco.py
Authors
John Klein, Mahmoud Albardan, Benjamin Guedj, Olivier Colot
arXiv ID
1804.10028
Category
stat.ML: Machine Learning (Stat)
Cross-listed
cs.AI, cs.DC, cs.LG
Citations
2
Venue
PKDD/ECML Workshops
Repository
https://github.com/john-klein/DELCO
★ 3
Last Checked
2 months ago
Abstract
We examine a network of learners that address the same classification task but must learn from different data sets. The learners cannot share data; instead they share their models, and each model is shared only once so as to limit the network load. We introduce DELCO (Decentralized Ensemble Learning with COpulas), a new approach for aggregating the predictions of the classifiers trained by each learner. The proposed method aggregates the base classifiers using a probabilistic model relying on Gaussian copulas. Experiments on logistic regression ensembles demonstrate competitive accuracy and increased robustness when the base classifiers are dependent. A companion Python implementation can be downloaded at https://github.com/john-klein/DELCO.
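The copula-based aggregation described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation (see the linked DELCO repository for that): it assumes Gaussian marginals for each base classifier's score per class (the paper uses a more general model), fits the Gaussian-copula correlation from probit-transformed scores, and aggregates by maximizing the class-conditional joint log-density. The function names `fit_copula_model` and `predict` are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def fit_copula_model(scores, labels):
    """Per class: fit Gaussian marginals for each base classifier's score
    and a Gaussian-copula correlation matrix capturing their dependence.
    scores: (n, K) array of K base-classifier scores; labels: (n,) classes.
    (Gaussian marginals are a simplifying assumption for this sketch.)"""
    model = {}
    for y in np.unique(labels):
        S = scores[labels == y]                      # samples of class y
        mu, sd = S.mean(axis=0), S.std(axis=0) + 1e-9
        U = np.clip(norm.cdf(S, mu, sd), 1e-6, 1 - 1e-6)  # prob. integral transform
        Z = norm.ppf(U)                              # probit scores
        R = np.corrcoef(Z, rowvar=False)             # copula correlation
        model[y] = (mu, sd, R, np.log(len(S) / len(scores)))
    return model

def predict(model, s):
    """Aggregate one sample's K base-classifier scores s by maximizing
    log prior + sum of marginal log-densities + Gaussian-copula log-density."""
    best, best_lp = None, -np.inf
    for y, (mu, sd, R, log_prior) in model.items():
        u = np.clip(norm.cdf(s, mu, sd), 1e-6, 1 - 1e-6)
        z = norm.ppf(u)
        _, logdet = np.linalg.slogdet(R)
        copula = -0.5 * logdet - 0.5 * z @ (np.linalg.inv(R) - np.eye(len(z))) @ z
        lp = log_prior + norm.logpdf(s, mu, sd).sum() + copula
        if lp > best_lp:
            best, best_lp = y, lp
    return best
```

Because the copula term `z @ (R^-1 - I) @ z` explicitly accounts for correlation between the probit-transformed scores, dependent classifiers are not over-counted the way a naive independence (product-of-marginals) aggregation would over-count them.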
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt β Machine Learning (Stat)
R.I.P.
👻
Ghosted
Distilling the Knowledge in a Neural Network
R.I.P.
👻
Ghosted
Layer Normalization
R.I.P.
👻
Ghosted
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
R.I.P.
👻
Ghosted
Domain-Adversarial Training of Neural Networks