Robust and Accurate – Compositional Architectures for Randomized Smoothing

April 01, 2022 · Entered Twilight · arXiv.org

💀 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: README.md, analysis, analyze_utils.py, architectures.py, architectures_macer.py, archs, certify_ace.py, certify_selection.py, data, datasets.py, predict_core.py, requirements.txt, scripts, smooth_ace.py, smooth_selection.py

Authors: Miklós Z. Horváth, Mark Niklas Müller, Marc Fischer, Martin Vechev
arXiv ID: 2204.00487
Category: cs.LG (Machine Learning)
Cross-listed: cs.AI, cs.CR
Citations: 14
Venue: arXiv.org
Repository: https://github.com/eth-sri/aces ⭐ 3
Last Checked: 2 months ago
Abstract
Randomized Smoothing (RS) is considered the state-of-the-art approach to obtain certifiably robust models for challenging tasks. However, current RS approaches drastically decrease standard accuracy on unperturbed data, severely limiting their real-world utility. To address this limitation, we propose a compositional architecture, ACES, which certifiably decides on a per-sample basis whether to use a smoothed model yielding predictions with guarantees or a more accurate standard model without guarantees. This, in contrast to prior approaches, enables both high standard accuracies and significant provable robustness. On challenging tasks such as ImageNet, we obtain, e.g., $80.0\%$ natural accuracy and $28.2\%$ certifiable accuracy against $\ell_2$ perturbations with $r=1.0$. We release our code and models at https://github.com/eth-sri/aces.
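The gist of the abstract is a per-sample routing scheme: a selection model, itself certified via randomized smoothing, decides whether an input goes to the smoothed (certifiable) classifier or to the accurate-but-uncertified standard model. The paper's actual implementation lives in `certify_ace.py`/`smooth_ace.py` in the repo; the following is only a minimal toy sketch of that idea, where the model callables, function names, and the binary selection convention are illustrative assumptions, not the authors' code.

```python
import numpy as np

def smoothed_predict(model, x, sigma=0.25, n=1000, rng=None):
    """Monte Carlo estimate of the smoothed classifier
    g(x) = argmax_c P[model(x + eps) = c], eps ~ N(0, sigma^2 I):
    majority vote over n noisy copies of x."""
    rng = np.random.default_rng(0) if rng is None else rng
    noise = rng.normal(0.0, sigma, size=(n,) + x.shape)
    votes = np.array([model(x + eps) for eps in noise], dtype=int)
    counts = np.bincount(votes)
    top = int(counts.argmax())
    # In the full method, a lower confidence bound on this vote fraction p
    # yields a certified l2 radius sigma * Phi^{-1}(p) (Cohen et al., 2019).
    return top, counts[top] / n

def aces_predict(selection_model, certified_model, core_model, x, sigma=0.25):
    """Illustrative ACES-style routing: the binary selection model is
    smoothed too, so the routing decision itself can be certified."""
    use_certified, _ = smoothed_predict(selection_model, x, sigma)
    if use_certified == 1:
        # Certifiable path: prediction comes with a smoothing guarantee.
        return smoothed_predict(certified_model, x, sigma)
    # Fallback: accurate standard model, no robustness guarantee.
    return core_model(x), None
```

For example, with toy 1-D models that are near-constant under the noise, inputs far on either side of the decision boundary are routed deterministically to one path or the other.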
Community shame:
Not yet rated

📜 Similar Papers

In the same crypt — Machine Learning