ASPIRO: Any-shot Structured Parsing-error-Induced ReprOmpting for Consistent Data-to-Text Generation

October 27, 2023 · Entered Twilight · 🏛 Conference on Empirical Methods in Natural Language Processing

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: .gitattributes, ASPIRO-poster-final.pdf, Readme.md, data, error_analysis.py, flags.py, helpers.py, images, models.py, parsing.py, prompt_templates, requirements-local.txt, requirements.txt, run_aspiro.py, run_falcon_experiments.sh, run_webnlg_experiments.sh, run_webnlg_g3p5.sh, run_wikidata_asdot_experiments.sh, run_wikidata_json_experiments.sh, scripts, setups

Authors: Martin Vejvar, Yasutaka Fujimoto
arXiv ID: 2310.17877
Category: cs.CL: Computation & Language
Cross-listed: cs.AI, cs.LG
Citations: 2
Venue: Conference on Empirical Methods in Natural Language Processing
Repository: https://github.com/vejvarm/ASPIRO
Last Checked: 2 months ago
Abstract
We present ASPIRO, an approach for structured data verbalisation into short template sentences in zero to few-shot settings. Unlike previous methods, our approach prompts large language models (LLMs) to directly produce entity-agnostic templates, rather than relying on LLMs to faithfully copy the given example entities or validating/crafting the templates manually. We incorporate LLM re-prompting, triggered by algorithmic parsing checks, as well as consistency validation based on the PARENT metric, to identify and rectify template generation problems in real time. Compared to direct LLM output, ASPIRO reduces the parsing error rate of generated verbalisations of RDF triples on the DART dataset by 66% on average. Our best 5-shot text-davinci-003 setup, scoring BLEU of 50.62, METEOR of 45.16, BLEURT of 0.82, NUBIA of 0.87, and PARENT of 0.8962 on the Rel2Text dataset, competes effectively with recent fine-tuned pre-trained language models.
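The abstract describes a re-prompting loop driven by algorithmic parsing checks. The sketch below is an illustrative, simplified reconstruction of that idea, not code from the linked repository: the placeholder format (<subject>/<object>), the check rules, the prompt wording, and all function names are assumptions, and the PARENT-based consistency validation stage is omitted.

```python
# Hypothetical sketch of ASPIRO-style parsing checks and re-prompting.
# Names, prompts, and placeholder conventions below are illustrative
# assumptions, not the actual API of the ASPIRO repository.

def parsing_errors(template):
    """Run simple algorithmic checks on a generated template sentence.

    Assumes a valid entity-agnostic template contains exactly one
    <subject> and one <object> placeholder and ends as a full sentence.
    """
    errors = []
    if template.count("<subject>") != 1:
        errors.append("template must contain exactly one <subject> placeholder")
    if template.count("<object>") != 1:
        errors.append("template must contain exactly one <object> placeholder")
    if not template.rstrip().endswith((".", "!", "?")):
        errors.append("template should end with sentence punctuation")
    return errors


def generate_template(llm, relation, feedback=None):
    """Build a (re-)prompt for one relation and query the LLM.

    `llm` is any callable mapping a prompt string to a completion string;
    the prompt wording is a stand-in for the paper's actual prompts.
    """
    prompt = (
        f"Write one short sentence that verbalises the relation '{relation}' "
        "between a subject and an object. Use the literal placeholders "
        "<subject> and <object> instead of concrete entities."
    )
    if feedback:
        prompt += "\nYour previous answer had these problems:\n- " + "\n- ".join(feedback)
        prompt += "\nPlease output a corrected template."
    return llm(prompt)


def aspiro_style_loop(llm, relation, max_retries=3):
    """Re-prompt the LLM until the parsing checks pass or retries run out."""
    template = generate_template(llm, relation)
    for _ in range(max_retries):
        errors = parsing_errors(template)
        if not errors:
            return template
        template = generate_template(llm, relation, feedback=errors)
    return template  # best effort after max_retries


if __name__ == "__main__":
    # Toy "LLM" that fails once, then produces a valid template,
    # just to exercise the control flow.
    calls = {"n": 0}

    def fake_llm(prompt):
        calls["n"] += 1
        if calls["n"] == 1:
            return "Alan Turing was born in London"  # concrete entities, no placeholders
        return "<subject> was born in <object>."

    print(aspiro_style_loop(fake_llm, "birthPlace"))
```

For the actual prompts and pipeline, see run_aspiro.py and the prompt_templates directory in the repository listed above.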
📜 Similar Papers

In the same crypt – Computation & Language

🌅 🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL ๐Ÿ› NeurIPS ๐Ÿ“š 166.0K cites 8 years ago