Neural Network Distiller: A Python Package For DNN Compression Research

October 27, 2019 · Declared Dead · 🏛 arXiv.org

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Neta Zmora, Guy Jacob, Lev Zlotnik, Bar Elharar, Gal Novik
arXiv ID: 1910.12232
Category: cs.LG (Machine Learning)
Cross-listed: stat.ML
Citations: 75
Venue: arXiv.org
Repository: https://github.com/NervanaSystems/distiller
Last Checked: 2 months ago
Abstract
This paper presents the philosophy, design and feature-set of Neural Network Distiller, an open-source Python package for DNN compression research. Distiller is a library of DNN compression algorithm implementations, with tools, tutorials and sample applications for various learning tasks. Its target users are both engineers and researchers, and its rich content is complemented by a design for extensibility that facilitates new research. Distiller is open-source and is available on GitHub at https://github.com/NervanaSystems/distiller.
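One family of compression algorithms a library like Distiller implements is magnitude-based weight pruning: the smallest-magnitude weights are zeroed out to induce sparsity. The sketch below is purely illustrative; `magnitude_prune` and its signature are hypothetical and are not Distiller's actual API.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights.

    Illustrative sketch of magnitude pruning, not Distiller's API.
    `weights` is a flat list of floats; `sparsity` is in [0, 1).
    """
    if sparsity <= 0.0:
        return list(weights)
    k = int(sparsity * len(weights))
    # Threshold = magnitude of the k-th smallest |weight|.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    # Keep weights strictly above the threshold; zero the rest.
    return [w if abs(w) > threshold else 0.0 for w in weights]

w = [0.05, -0.9, 0.3, -0.02]
pruned = magnitude_prune(w, sparsity=0.5)
# the two smallest-magnitude weights (0.05 and -0.02) become zero
```

In practice a framework such as Distiller drives pruning like this from a schedule (when and how aggressively to prune each layer during training) rather than as a one-shot operation, which is what makes a dedicated library useful for compression research.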
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Machine Learning

Died the same way — 💀 404 Not Found