Domain adaptation in practice: Lessons from a real-world information extraction pipeline

Timothy Miller, Egoitz Laparra, Steven Bethard

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citations

Abstract

Advances in transfer learning and domain adaptation have raised hopes that once-challenging NLP tasks are ready to be put to use for sophisticated information extraction needs. In this work, we describe an effort to do just that – combining state-of-the-art neural methods for negation detection, document time relation extraction, and aspectual link prediction, with the eventual goal of extracting drug timelines from electronic health record text. We train on the THYME colon cancer corpus and test on both the THYME brain cancer corpus and an internal corpus, and show that performance of the combined systems is unacceptable despite good performance of individual systems. Although domain adaptation shows improvements on each individual system, the model selection problem is a barrier to improving overall pipeline performance.

Original language: English (US)
Title of host publication: Adapt-NLP 2021 - 2nd Workshop on Domain Adaptation for NLP, Proceedings
Editors: Eyal Ben-David, Shay Cohen, Ryan McDonald, Barbara Plank, Roi Reichart, Guy Rotman, Yftah Ziser
Publisher: Association for Computational Linguistics (ACL)
Pages: 105-110
Number of pages: 6
ISBN (Electronic): 9781954085084
State: Published - 2021
Event: 2nd Workshop on Domain Adaptation for NLP, Adapt-NLP 2021 - Kyiv, Ukraine
Duration: Apr 20 2021 → …

Publication series

Name: Adapt-NLP 2021 - 2nd Workshop on Domain Adaptation for NLP, Proceedings

Conference

Conference: 2nd Workshop on Domain Adaptation for NLP, Adapt-NLP 2021
Country/Territory: Ukraine
City: Kyiv
Period: 4/20/21 → …

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
