Classical Sequence Match is a Competitive Few-Shot One-Class Learner

September 14, 2022 · Entered Twilight · 🏛 International Conference on Computational Linguistics

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: 1. SN.py, 2. OWP.py, 3. CA.py, 4. BiCA & BiCA+finetune.py, 5. DistilBert & DistilBert+finetune.py, 6. BERT & BERT+finetune.py, 7. BERT(p) & BERT(p)+finetune.py, 8. BERT+MAML.py, 9. DistilBert+MAML.py, 10. BiCA+MAML.py, LICENSE, README.md, compute_cov_score, datasets
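The listing includes a `compute_cov_score` script, presumably tied to the feature-correlation analysis described in the abstract. As a rough sketch of what such a score might measure (the function name and the exact metric below are assumptions, not taken from the repo), one could report the mean absolute off-diagonal correlation across feature dimensions:

```python
import numpy as np

def correlation_score(features):
    """Mean absolute off-diagonal correlation across feature dimensions.

    features: (n_samples, d) matrix of extracted features.
    Returns a value in [0, 1]; higher means more redundant dimensions.
    """
    corr = np.corrcoef(features, rowvar=False)   # (d, d) correlation matrix
    d = corr.shape[0]
    off_diag = corr[~np.eye(d, dtype=bool)]      # drop the diagonal of ones
    return float(np.abs(off_diag).mean())
```

Under this reading, a score near 1 would indicate the "high-correlation dimensions" the abstract attributes to meta-learned transformer features, while features with independent dimensions would score near 0.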

Authors: Mengting Hu, Hang Gao, Yinhao Bai, Mingming Liu
arXiv ID: 2209.06394
Category: cs.LG: Machine Learning
Cross-listed: cs.CL
Citations: 0
Venue: International Conference on Computational Linguistics
Repository: https://github.com/hmt2014/FewOne ⭐ 3
Last Checked: 2 months ago
Abstract
Transformer-based models have gradually become the default choice in artificial intelligence, and they show superiority even in few-shot scenarios. In this paper, we revisit classical methods and propose a new few-shot alternative. Specifically, we investigate the few-shot one-class problem, which takes a known sample as a reference to detect whether an unknown instance belongs to the same class. This problem can be studied from the perspective of sequence match. We show that, with meta-learning, the classical sequence-match method Compare-Aggregate significantly outperforms transformer-based models while requiring much less training cost. Furthermore, we perform an empirical comparison between the two kinds of sequence-match approaches under simple fine-tuning and under meta-learning. Meta-learning causes the transformer models' features to have high-correlation dimensions, an effect closely related to the number of layers and attention heads of the transformer models. Experimental code and data are available at https://github.com/hmt2014/FewOne
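The Compare-Aggregate idea the abstract builds on can be illustrated in a few lines: soft-align each query token to the reference, compare the aligned pairs element-wise, then aggregate to a single match score. The following is a minimal, hypothetical NumPy sketch of that pattern, not the authors' implementation (which uses learned attention, comparison, and aggregation layers):

```python
import numpy as np

def compare_aggregate_score(ref, query):
    """Score whether `query` matches the class of the reference `ref`.

    ref:   (m, d) embeddings of the known reference sequence
    query: (n, d) embeddings of the unknown query sequence
    Returns a scalar match score in (0, 1).
    """
    # Attention: soft-align each query token to the reference tokens.
    logits = query @ ref.T                        # (n, m) similarity
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)
    aligned = attn @ ref                          # (n, d) aligned reference view

    # Compare: element-wise product of each query token with its aligned match.
    compared = query * aligned                    # (n, d)

    # Aggregate: mean-pool over tokens and dimensions, squash to (0, 1).
    score = compared.mean()
    return 1.0 / (1.0 + np.exp(-score))
```

In the one-class setting this scalar plays the role of a same-class probability: a query drawn from the reference's class should score higher than one that is not.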
Community shame:
Not yet rated

📜 Similar Papers

In the same crypt — Machine Learning