The Ethereal
Backprop as Functor: A compositional perspective on supervised learning
November 28, 2017 · The Ethereal · Logic in Computer Science
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Brendan Fong, David I. Spivak, Rémy Tuyéras
arXiv ID
1711.10455
Category
math.CT: Category Theory
Cross-listed
cs.AI, cs.LG
Citations
118
Venue
Logic in Computer Science
Last Checked
1 month ago
Abstract
A supervised learning algorithm searches over a set of functions $A \to B$ parametrised by a space $P$ to find the best approximation to some ideal function $f\colon A \to B$. It does this by taking examples $(a,f(a)) \in A\times B$, and updating the parameter according to some rule. We define a category where these update rules may be composed, and show that gradient descent---with respect to a fixed step size and an error function satisfying a certain property---defines a monoidal functor from a category of parametrised functions to this category of update rules. This provides a structural perspective on backpropagation, as well as a broad generalisation of neural networks.
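The abstract's notion of a learner can be sketched concretely. Below is a hypothetical minimal example (not code from the paper): a parametrised function `implement(p, a)` playing the role of the family $A \to B$, together with a gradient-descent update rule on training pairs $(a, f(a))$, using a fixed step size and squared error. All names here are illustrative assumptions.

```python
import random

# Hypothetical sketch of a "learner": a parametrised family A -> B
# plus an update rule driven by examples (a, f(a)).

def implement(p, a):
    # Parametrised family of functions A -> B: here, scaling by p.
    return p * a

def update(p, a, b, step=0.01):
    # Gradient descent with fixed step size on the squared error
    # E(p) = (implement(p, a) - b)**2 / 2, whose gradient in p is
    # (implement(p, a) - b) * a.
    return p - step * (implement(p, a) - b) * a

# Search for the best approximation to the ideal function f(a) = 3*a
# by repeatedly updating the parameter on examples (a, f(a)).
p = 0.0
for _ in range(2000):
    a = random.uniform(-1.0, 1.0)
    p = update(p, a, 3 * a)
```

After training, `p` is close to 3, i.e. the learner has approximated the ideal function. The paper's contribution is that such update rules compose, forming a category into which gradient descent lands functorially.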
Similar Papers
In the same crypt – Category Theory
Open Diagrams via Coend Calculus
Executions in (Semi-)Integer Petri Nets are Compact Closed Categories
Compositional Scientific Computing with Catlab and SemanticModels
Computational Petri Nets: Adjunctions Considered Harmful