R.I.P.
👻
Ghosted
Robust and Accurate -- Compositional Architectures for Randomized Smoothing
April 01, 2022 · Entered Twilight · arXiv.org
Repo contents: README.md, analysis, analyze_utils.py, architectures.py, architectures_macer.py, archs, certify_ace.py, certify_selection.py, data, datasets.py, predict_core.py, requirements.txt, scripts, smooth_ace.py, smooth_selection.py
Authors
MiklΓ³s Z. HorvΓ‘th, Mark Niklas MΓΌller, Marc Fischer, Martin Vechev
arXiv ID
2204.00487
Category
cs.LG: Machine Learning
Cross-listed
cs.AI, cs.CR
Citations
14
Venue
arXiv.org
Repository
https://github.com/eth-sri/aces
★ 3
Last Checked
2 months ago
Abstract
Randomized Smoothing (RS) is considered the state-of-the-art approach to obtain certifiably robust models for challenging tasks. However, current RS approaches drastically decrease standard accuracy on unperturbed data, severely limiting their real-world utility. To address this limitation, we propose a compositional architecture, ACES, which certifiably decides on a per-sample basis whether to use a smoothed model yielding predictions with guarantees or a more accurate standard model without guarantees. This, in contrast to prior approaches, enables both high standard accuracies and significant provable robustness. On challenging tasks such as ImageNet, we obtain, e.g., $80.0\%$ natural accuracy and $28.2\%$ certifiable accuracy against $\ell_2$ perturbations with $r=1.0$. We release our code and models at https://github.com/eth-sri/aces.
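The per-sample routing the abstract describes can be sketched in a few lines. The following is a minimal toy illustration, not the paper's implementation: the models, thresholds, and the `aces_predict` helper are hypothetical stand-ins, and the selection decision is smoothed by a simple Gaussian majority vote rather than the full statistical certification used in the paper.

```python
import numpy as np

# Hypothetical toy stand-ins (NOT the paper's trained networks):
# a smoothed classifier with guarantees and a more accurate standard one.
rng = np.random.default_rng(0)

def smoothed_model(x):
    """Robust-but-conservative classifier (placeholder)."""
    return int(x.sum() > 0)

def standard_model(x):
    """Accurate classifier without robustness guarantees (placeholder)."""
    return int(x.mean() > 0.1)

def selection_model(x):
    """Binary selector: 1 -> route x to the certified model (placeholder)."""
    return int(np.abs(x.sum()) > 2.0)

def aces_predict(x, sigma=0.25, n=1000, threshold=0.99):
    """Simplified ACES-style compositional prediction.

    The selection decision itself is made under Gaussian noise: only if a
    large majority of noisy copies of x choose the certified path do we use
    the smoothed model; otherwise we fall back to the standard model.
    (The actual method certifies this vote statistically.)
    """
    noise = rng.normal(0.0, sigma, size=(n,) + x.shape)
    votes = np.array([selection_model(x + e) for e in noise])
    if votes.mean() >= threshold:
        return smoothed_model(x), True   # certified path
    return standard_model(x), False      # accurate fallback, no guarantee

pred, certified = aces_predict(np.full(4, 2.0))
```

This composition is what lets the architecture keep the standard model's accuracy on easy inputs while still emitting robustness certificates whenever the selector routes an input to the smoothed model.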
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: Machine Learning
XGBoost: A Scalable Tree Boosting System
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Semi-Supervised Classification with Graph Convolutional Networks
Proximal Policy Optimization Algorithms