Learning to Reason: Leveraging Neural Networks for Approximate DNF Counting

April 04, 2019 · Entered Twilight · 🏛 AAAI Conference on Artificial Intelligence

🌅 TWILIGHT: Old Age
Predates the code-sharing era – a pioneer of its time

"Last commit was 6.0 years ago (≥5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .DS_Store, AnalyseData.py, DNFGen.py, DNFProblem.py, GraphNeuralNet.py, README.md, Train.py, generateData.py, generateTestData.py, netParams_2, runExperiments.py, runExperimentsBySize.py, runtimeTesting.py, visualisation.py

Authors: Ralph Abboud, Ismail Ilkan Ceylan, Thomas Lukasiewicz
arXiv ID: 1904.02688
Category: cs.AI: Artificial Intelligence
Cross-listed: cs.LG
Citations: 32
Venue: AAAI Conference on Artificial Intelligence
Repository: https://github.com/ralphabb/NeuralDNF/ ⭐ 9
Last Checked: 1 month ago
Abstract
Weighted model counting (WMC) has emerged as a prevalent approach for probabilistic inference. In its most general form, WMC is #P-hard. Weighted DNF counting (weighted #DNF) is a special case, where approximations with probabilistic guarantees can be obtained in O(nm) time, where n denotes the number of variables and m the number of clauses of the input DNF; this, however, is not scalable in practice. In this paper, we propose a neural model counting approach for weighted #DNF that combines approximate model counting with deep learning, and accurately approximates model counts in linear time when clause width is bounded. We conduct experiments to validate our method, and show that our model learns and generalizes very well to large-scale #DNF instances.
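The O(nm)-time guarantee the abstract alludes to comes from the classical Karp–Luby Monte Carlo estimator for #DNF, which the paper's neural approach is benchmarked against conceptually. Below is a minimal sketch of that estimator for the unweighted case; it is not code from the paper's repository, and the function name and clause encoding (a clause as a dict mapping a variable index to its required truth value) are illustrative assumptions.

```python
import random

def karp_luby_count(clauses, n_vars, samples=100_000, seed=0):
    """Estimate the number of satisfying assignments of a DNF.

    Each clause is a partial assignment {var_index: required_bool};
    an assignment satisfies the DNF if it extends at least one clause.
    """
    rng = random.Random(seed)
    # A clause with k fixed literals is extended by 2^(n-k) full assignments.
    weights = [2 ** (n_vars - len(c)) for c in clauses]
    total = sum(weights)  # size of the union-with-multiplicity sample space
    hits = 0
    for _ in range(samples):
        # Pick a clause with probability proportional to its weight,
        # then a uniform full assignment consistent with it.
        i = rng.choices(range(len(clauses)), weights=weights)[0]
        assign = dict(clauses[i])
        for v in range(n_vars):
            if v not in assign:
                assign[v] = rng.random() < 0.5
        # Count the sample only when i is the FIRST clause it satisfies,
        # so each satisfying assignment is counted exactly once.
        first = next(j for j, c in enumerate(clauses)
                     if all(assign[v] == val for v, val in c.items()))
        if first == i:
            hits += 1
    return total * hits / samples
```

For example, the DNF (x0) ∨ (x1) over two variables has three satisfying assignments, and the estimate concentrates around 3 as the sample count grows. Each sample costs O(nm) work in the worst case (filling n variables, scanning m clauses), which is exactly the per-sample cost the abstract calls "not scalable in practice" for large instances.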
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Artificial Intelligence