Backprop as Functor: A compositional perspective on supervised learning

November 28, 2017 · The Ethereal · 🏛 Logic in Computer Science

🔮 The Ethereal
Pure theory — exists on a plane beyond code

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Brendan Fong, David I. Spivak, Rémy Tuyéras
arXiv ID: 1711.10455
Category: math.CT (Category Theory)
Cross-listed: cs.AI, cs.LG
Citations: 118
Venue: Logic in Computer Science
Last checked: 1 month ago
Abstract
A supervised learning algorithm searches over a set of functions $A \to B$ parametrised by a space $P$ to find the best approximation to some ideal function $f\colon A \to B$. It does this by taking examples $(a,f(a)) \in A\times B$, and updating the parameter according to some rule. We define a category where these update rules may be composed, and show that gradient descent---with respect to a fixed step size and an error function satisfying a certain property---defines a monoidal functor from a category of parametrised functions to this category of update rules. This provides a structural perspective on backpropagation, as well as a broad generalisation of neural networks.
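The "category where these update rules may be composed" can be made concrete in code: a learner carries a parameter, an implementation, an update rule, and a request that passes a training signal back to any upstream learner, and sequential composition pairs parameters and chains requests, which is where backpropagation appears. The following Python sketch is illustrative only; the names `Learner`, `compose`, and `linear`, and the scalar gradient-descent example with step size `eps`, are our choices, not the paper's notation.

```python
class Learner:
    """A learner in the style of the paper: a parameter p, an
    implementation I(p, a) -> b, an update rule U(p, a, b) -> p',
    and a request r(p, a, b) -> a' that sends a training signal
    to any upstream learner."""

    def __init__(self, p, implement, update, request):
        self.p = p
        self.implement = implement
        self.update = update
        self.request = request

    def __call__(self, a):
        return self.implement(self.p, a)

    def step(self, a, b):
        """Train on one example (a, b); return the requested input,
        i.e. the signal an upstream learner would train against."""
        a_req = self.request(self.p, a, b)
        self.p = self.update(self.p, a, b)
        return a_req


def compose(g, f):
    """Sequential composite 'g after f': parameters pair up, and f
    trains against the input that g requests -- this chaining of
    requests is backpropagation, viewed compositionally."""
    def implement(pq, a):
        pf, pg = pq
        return g.implement(pg, f.implement(pf, a))

    def update(pq, a, c):
        pf, pg = pq
        b = f.implement(pf, a)
        b_req = g.request(pg, b, c)  # the signal g sends back to f
        return (f.update(pf, a, b_req), g.update(pg, b, c))

    def request(pq, a, c):
        pf, pg = pq
        b = f.implement(pf, a)
        return f.request(pf, a, g.request(pg, b, c))

    return Learner((f.p, g.p), implement, update, request)


# Gradient descent on a scalar linear map I(p, a) = p * a with
# quadratic error E = (p*a - b)^2 / 2 and fixed step size eps: the
# paper's functor sends a parametrised map to a learner of this shape.
eps = 0.1

def linear(p0):
    return Learner(p0,
                   lambda p, a: p * a,
                   lambda p, a, b: p - eps * a * (p * a - b),  # p - eps*dE/dp
                   lambda p, a, b: a - eps * p * (p * a - b))  # a - eps*dE/da

lin = linear(0.5)
for _ in range(300):
    lin.step(1.0, 2.0)  # examples from the ideal map a -> 2a

# Two composed layers: updates flow through the chained requests.
h = compose(linear(1.0), linear(1.0))
for _ in range(300):
    h.step(1.0, 2.0)
```

Running the loops above drives `lin.p` toward 2, and the composite `h` toward any parameter pair whose product is 2, without `compose` ever inspecting how its factors compute their gradients; that locality is the point of the functorial description.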
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Category Theory

🔮 The Ethereal

Algebraic Databases

Patrick Schultz, David I. Spivak, ... (+2 more)

math.CT ๐Ÿ› Theory and Applications of Categories ๐Ÿ“š 35 cites 10 years ago
🔮 The Ethereal

Formal composition of hybrid systems

Jared Culbertson, Paul Gustafson, ... (+2 more)

math.CT ๐Ÿ› Theory and Applications of Categories ๐Ÿ“š 11 cites 6 years ago