Taurus: A Data Plane Architecture for Per-Packet ML
February 12, 2020 · Declared Dead · International Conference on Architectural Support for Programming Languages and Operating Systems
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Tushar Swamy, Alexander Rucker, Muhammad Shahbaz, Ishan Gaur, Kunle Olukotun
arXiv ID
2002.08987
Category
cs.NI: Networking & Internet
Cross-listed
cs.LG, cs.PF
Citations
110
Venue
International Conference on Architectural Support for Programming Languages and Operating Systems
Last Checked
2 months ago
Abstract
Emerging applications -- cloud computing, the internet of things, and augmented/virtual reality -- demand responsive, secure, and scalable datacenter networks. These networks currently implement simple, per-packet, data-plane heuristics (e.g., ECMP and sketches) under a slow, millisecond-latency control plane that runs data-driven performance and security policies. However, to meet applications' service-level objectives (SLOs) in a modern data center, networks must bridge the gap between line-rate, per-packet execution and complex decision making. In this work, we present the design and implementation of Taurus, a data plane for line-rate inference. Taurus adds custom hardware based on a flexible, parallel-patterns (MapReduce) abstraction to programmable network devices, such as switches and NICs; this new hardware uses pipelined SIMD parallelism to enable per-packet MapReduce operations (e.g., inference). Our evaluation of a Taurus switch ASIC -- supporting several real-world models -- shows that Taurus operates orders of magnitude faster than a server-based control plane while increasing area by 3.8% and latency for line-rate ML models by up to 221 ns. Furthermore, our Taurus FPGA prototype achieves full model accuracy and detects two orders of magnitude more events than a state-of-the-art control-plane anomaly-detection system.
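The abstraction the abstract describes, expressing per-packet computation as parallel map and reduce stages, can be illustrated with a toy software sketch. This is illustrative only: Taurus implements these stages in pipelined SIMD hardware at line rate, and every name and the tiny one-neuron model below are hypothetical, not taken from the paper.

```python
# Toy sketch of a per-packet MapReduce-style inference stage.
# Illustrative only: Taurus realizes map/reduce stages in pipelined SIMD
# hardware; the function names and the one-neuron model are hypothetical.

def map_stage(features, weights):
    # Elementwise multiply: conceptually one multiplier lane per feature
    # (the SIMD "map" stage).
    return [f * w for f, w in zip(features, weights)]

def reduce_stage(products, bias):
    # Adder-tree-style reduction of the lane outputs to one activation.
    return sum(products) + bias

def relu(x):
    # Simple nonlinearity applied after the reduction.
    return max(0.0, x)

def per_packet_inference(packet_features, weights, bias):
    """One neuron evaluated per packet: map -> reduce -> activation."""
    return relu(reduce_stage(map_stage(packet_features, weights), bias))

# Example: score one packet's header-derived features.
score = per_packet_inference([1.0, 0.5, 2.0], [0.2, -0.4, 0.1], 0.05)
```

In hardware, each packet flows through these stages as a pipeline, so a new packet can enter every cycle; the software loop above only models the dataflow, not the timing.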
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: Networking & Internet
Federated Learning in Mobile Edge Networks: A Comprehensive Survey
A Survey of Indoor Localization Systems and Technologies
Survey of Important Issues in UAV Communication Networks
Network Function Virtualization: State-of-the-art and Research Challenges
Applications of Deep Reinforcement Learning in Communications and Networking: A Survey
Died the same way: 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System