Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification

Mitch Paul Mithun, Sandeep Suntwal, Mihai Surdeanu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

While neural networks produce state-of-the-art performance on several NLP tasks, they depend heavily on lexicalized information, which transfers poorly between domains. Previous work (Suntwal et al., 2019) proposed delexicalization as a form of knowledge distillation to reduce dependency on such lexical artifacts. However, a critical unsolved issue remains: how much delexicalization should be applied? A little helps reduce overfitting, but too much discards useful information. We propose Group Learning (GL), a knowledge and model distillation approach for fact verification. In our method, while multiple student models have access to different delexicalized data views, they are encouraged to collectively learn from each other through pair-wise consistency losses. In several cross-domain experiments between the FEVER and FNC fact verification datasets, we show that our approach learns the best delexicalization strategy for the given training dataset and outperforms state-of-the-art classifiers that rely on the original data.
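
To make the pair-wise consistency idea in the abstract concrete, below is a minimal sketch of two student classifiers trained on different (delexicalized) views of the same batch, with a symmetric KL term encouraging their predictions to agree. This is an illustrative assumption of how such a loss could be wired up, not the authors' implementation; the toy encoder, model names, and loss weighting are all hypothetical.

```python
# A minimal sketch of pair-wise consistency training between two students,
# assuming PyTorch. All names and hyperparameters here are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Student(nn.Module):
    """Toy claim-evidence classifier over pre-encoded features."""
    def __init__(self, dim=128, num_labels=3):
        super().__init__()
        self.classifier = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, num_labels)
        )

    def forward(self, x):
        return self.classifier(x)

def pairwise_consistency(logits_a, logits_b):
    """Symmetric KL divergence between two students' predicted distributions."""
    log_p_a = F.log_softmax(logits_a, dim=-1)
    log_p_b = F.log_softmax(logits_b, dim=-1)
    kl_ab = F.kl_div(log_p_a, log_p_b.exp(), reduction="batchmean")
    kl_ba = F.kl_div(log_p_b, log_p_a.exp(), reduction="batchmean")
    return 0.5 * (kl_ab + kl_ba)

# Two students, each seeing a different delexicalized view of the same examples.
student_a, student_b = Student(), Student()
params = list(student_a.parameters()) + list(student_b.parameters())
optimizer = torch.optim.Adam(params, lr=1e-4)

# Dummy batch: view_a and view_b stand in for features of two delexicalized
# views of the same claim-evidence pairs; labels are the gold verdicts.
view_a, view_b = torch.randn(8, 128), torch.randn(8, 128)
labels = torch.randint(0, 3, (8,))

logits_a, logits_b = student_a(view_a), student_b(view_b)
supervised = F.cross_entropy(logits_a, labels) + F.cross_entropy(logits_b, labels)
consistency = pairwise_consistency(logits_a, logits_b)
loss = supervised + 1.0 * consistency  # consistency weight is an assumption
loss.backward()
optimizer.step()
```

With more than two students, the same consistency term would be summed over all student pairs; the weighting between the supervised and consistency losses is a design choice the sketch leaves at 1.0.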

Original language: English (US)
Title of host publication: EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
Publisher: Association for Computational Linguistics (ACL)
Pages: 6968-6973
Number of pages: 6
ISBN (Electronic): 9781955917094
State: Published - 2021
Event: 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021 - Virtual, Punta Cana, Dominican Republic
Duration: Nov 7, 2021 - Nov 11, 2021

Publication series

Name: EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings

Conference

Conference: 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021
Country/Territory: Dominican Republic
City: Virtual, Punta Cana
Period: 11/7/21 - 11/11/21

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Computer Science Applications
  • Information Systems
