Improving temporal relation extraction with training instance augmentation

Chen Lin, Timothy Miller, Dmitriy Dligach, Steven Bethard, Guergana Savova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Scopus citations

Abstract

Temporal relation extraction is important for understanding the ordering of events in narrative text. We describe a method for increasing the number of high-quality training instances available to a temporal relation extraction task, adapting to different annotation styles in the clinical domain by taking advantage of the Unified Medical Language System (UMLS). This method notably improves clinical temporal relation extraction, goes beyond featurizing or duplicating the same information, and generalizes between-argument signals in a more effective and robust fashion. We also report a new state-of-the-art result, a two-point improvement over the best Clinical TempEval 2016 system.
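To make the idea of training instance augmentation more concrete, the sketch below shows one plausible way extra relation instances could be generated by normalizing argument surface forms against a UMLS-style lookup, so that lexical variants of the same concept share between-argument signals. The data classes, the `augment_with_umls` helper, and the toy lookup table are illustrative assumptions for this example, not the authors' actual implementation.

```python
# Hypothetical sketch of training-instance augmentation for temporal relation
# extraction. The instance format, UMLS lookup, and function names are
# illustrative assumptions, not the method described in the paper.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class RelationInstance:
    """One candidate argument pair with its context and gold label."""
    arg1: str          # surface text of the first argument (e.g., an EVENT)
    arg2: str          # surface text of the second argument (e.g., a TIMEX3)
    between_text: str  # tokens between the two arguments
    label: str         # e.g., CONTAINS, BEFORE, OVERLAP, NONE


def augment_with_umls(instances: List[RelationInstance],
                      semantic_type: Dict[str, str]) -> List[RelationInstance]:
    """Create additional training instances by replacing argument surface
    forms with their UMLS semantic types, so the classifier sees the same
    between-argument context attached to a more general argument form."""
    augmented = list(instances)
    for inst in instances:
        t1 = semantic_type.get(inst.arg1.lower())
        t2 = semantic_type.get(inst.arg2.lower())
        if t1 or t2:
            augmented.append(RelationInstance(
                arg1=t1 or inst.arg1,
                arg2=t2 or inst.arg2,
                between_text=inst.between_text,
                label=inst.label,
            ))
    return augmented


if __name__ == "__main__":
    # Toy lookup standing in for a real UMLS dictionary (assumed mapping).
    lookup = {"colonoscopy": "Diagnostic_Procedure"}
    gold = [RelationInstance("colonoscopy", "June 3", "was performed on", "CONTAINS")]
    for inst in augment_with_umls(gold, lookup):
        print(inst)
```

Under these assumptions the augmented set contains both the original instance and a normalized copy, which is one way to increase the number of training instances without simply duplicating identical feature vectors.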

Original language: English (US)
Title of host publication: BioNLP 2016 - Proceedings of the 15th Workshop on Biomedical Natural Language Processing
Editors: Kevin Bretonnel Cohen, Dina Demner-Fushman, Sophia Ananiadou, Jun-ichi Tsujii
Publisher: Association for Computational Linguistics (ACL)
Pages: 108-113
Number of pages: 6
ISBN (Electronic): 9781945626128
State: Published - 2016
Externally published: Yes
Event: 15th Workshop on Biomedical Natural Language Processing, BioNLP 2016 - Berlin, Germany
Duration: Aug 12 2016 → …

Publication series

Name: BioNLP 2016 - Proceedings of the 15th Workshop on Biomedical Natural Language Processing

Conference

Conference: 15th Workshop on Biomedical Natural Language Processing, BioNLP 2016
Country/Territory: Germany
City: Berlin
Period: 8/12/16 → …

ASJC Scopus subject areas

  • Biomedical Engineering
  • Language and Linguistics
  • Information Systems
  • Software
  • Health Informatics
  • Computer Science Applications
