Exploration of Neural Machine Translation in Autoformalization of Mathematics in Mizar

December 05, 2019 · The Ethereal · 🏛 Certified Programs and Proofs

🔮 THE ETHEREAL
Pure theory - exists on a plane beyond code

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Qingxiang Wang, Chad Brown, Cezary Kaliszyk, Josef Urban
arXiv ID: 1912.02636
Category: cs.LO (Logic in Computer Science)
Cross-listed: cs.CL, cs.LG
Citations: 59
Venue: Certified Programs and Proofs
Last checked: 2 months ago
Abstract
In this paper we report several experiments on automatically translating informal mathematics into formal mathematics. In our context, informal mathematics refers to human-written mathematical sentences in LaTeX format, and formal mathematics refers to statements in the Mizar language. We conducted our experiments with three established neural machine translation models that are known to deliver competitive results on translation between natural languages. To train these models we also prepared four informal-to-formal datasets. We compare and analyze our results according to whether the model is supervised or unsupervised. In order to augment the data available for autoformalization and improve the results, we developed a custom type-elaboration mechanism and integrated it into the supervised translation.
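The abstract describes training neural machine translation models on aligned pairs of informal LaTeX sentences and formal Mizar statements. As a minimal sketch of what one such aligned training pair might look like, here is a toy example; the concrete sentence pair and the whitespace tokenizer are illustrative assumptions, not the paper's actual datasets or preprocessing:

```python
# Toy illustration of an informal-to-formal aligned pair of the kind
# an NMT model would be trained on. The specific sentences and the
# whitespace tokenizer are assumptions made for illustration only.

latex_src = r"Let $x$ be a natural number."   # informal side (LaTeX)
mizar_tgt = "let x be Nat;"                   # formal side (Mizar)

def tokenize(sentence: str) -> list[str]:
    # Whitespace tokenization as a stand-in for the subword
    # tokenizers that NMT models typically use.
    return sentence.split()

src_tokens = tokenize(latex_src)
tgt_tokens = tokenize(mizar_tgt)
print(src_tokens)
print(tgt_tokens)
```

A real pipeline would feed many such (source, target) token sequences into a sequence-to-sequence model; this fragment only shows the shape of the data.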
Community shame:
Not yet rated
