The Ethereal
Exploration of Neural Machine Translation in Autoformalization of Mathematics in Mizar
December 05, 2019 · The Ethereal · Certified Programs and Proofs
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Qingxiang Wang, Chad Brown, Cezary Kaliszyk, Josef Urban
arXiv ID
1912.02636
Category
cs.LO: Logic in Computer Science
Cross-listed
cs.CL, cs.LG
Citations
59
Venue
Certified Programs and Proofs
Last Checked
2 months ago
Abstract
In this paper we share several experiments trying to automatically translate informal mathematics into formal mathematics. In our context, informal mathematics refers to human-written mathematical sentences in the LaTeX format, and formal mathematics refers to statements in the Mizar language. We conducted our experiments against three established neural network-based machine translation models that are known to deliver competitive results on translating between natural languages. To train these models we also prepared four informal-to-formal datasets. We compare and analyze our results according to whether the model is supervised or unsupervised. In order to augment the data available for autoformalization and improve the results, we develop a custom type-elaboration mechanism and integrate it in the supervised translation.
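The informal-to-formal task described in the abstract can be made concrete with an illustrative translation pair (a hypothetical example for clarity, not taken from the paper's datasets): a human-written LaTeX sentence as the source, and a corresponding statement in the Mizar language as the target.

```mizar
:: Informal (LaTeX source): If $X \subseteq Y$ and $Y \subseteq Z$, then $X \subseteq Z$.
:: Formal (Mizar target):
theorem
  for X, Y, Z being set st X c= Y & Y c= Z holds X c= Z;
```

Here `c=` is Mizar's inclusion operator; the translation models are trained to produce well-typed statements of this form from the informal LaTeX input.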
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Logic in Computer Science
Safe Reinforcement Learning via Shielding
Formal Verification of Piece-Wise Linear Feed-Forward Neural Networks
Heterogeneous substitution systems revisited
Omega-Regular Objectives in Model-Free Reinforcement Learning