
Self-training improves Recurrent Neural Networks performance for Temporal Relation Extraction

  • Chen Lin
  • Timothy A. Miller
  • Dmitriy Dligach
  • Hadi Amiri
  • Steven Bethard
  • Guergana Savova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural network models are often restricted by limited labeled instances and resort to advanced architectures and features for cutting-edge performance. We propose to build a recurrent neural network with multiple semantically heterogeneous embeddings within a self-training framework. Our framework makes use of labeled, unlabeled, and social media data, operates on basic features, and is scalable and generalizable. With this method, we establish the state-of-the-art result for both in-domain and cross-domain settings on a clinical temporal relation extraction task.
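The self-training loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a scikit-learn `LogisticRegression` stands in for the paper's recurrent neural network, and the confidence threshold and round count are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def self_train(X_labeled, y_labeled, X_unlabeled,
               threshold=0.9, max_rounds=5):
    """Generic self-training: repeatedly fit a model on the labeled set,
    pseudo-label the unlabeled pool, and absorb only high-confidence
    predictions back into the training data.

    LogisticRegression is a stand-in for the paper's RNN; threshold
    and max_rounds are illustrative, not values from the paper.
    """
    X, y = np.asarray(X_labeled).copy(), np.asarray(y_labeled).copy()
    pool = np.asarray(X_unlabeled).copy()
    model = LogisticRegression(max_iter=1000)
    for _ in range(max_rounds):
        model.fit(X, y)
        if len(pool) == 0:
            break
        probs = model.predict_proba(pool)
        preds = model.predict(pool)
        confident = probs.max(axis=1) >= threshold  # keep sure pseudo-labels
        if not confident.any():
            break  # nothing new to learn from this round
        X = np.vstack([X, pool[confident]])
        y = np.concatenate([y, preds[confident]])
        pool = pool[~confident]  # shrink the unlabeled pool
    return model
```

The key design point, mirrored in the paper's framework, is that only predictions above the confidence threshold are promoted to training labels, so early-model noise does not flood the training set.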

Original language: English (US)
Title of host publication: EMNLP 2018 - 9th International Workshop on Health Text Mining and Information Analysis, LOUHI 2018 - Proceedings of the Workshop
Publisher: Association for Computational Linguistics (ACL)
Pages: 165-176
Number of pages: 12
ISBN (Electronic): 9781948087742
State: Published - 2018
Event: 9th International Workshop on Health Text Mining and Information Analysis, LOUHI 2018, co-located with EMNLP 2018 - Brussels, Belgium
Duration: Oct 31 2018 → …

Publication series

Name: EMNLP 2018 - 9th International Workshop on Health Text Mining and Information Analysis, LOUHI 2018 - Proceedings of the Workshop

Conference

Conference: 9th International Workshop on Health Text Mining and Information Analysis, LOUHI 2018, co-located with EMNLP 2018
Country/Territory: Belgium
City: Brussels
Period: 10/31/18 → …

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Computational Theory and Mathematics
