Neural temporal relation extraction

Dmitriy Dligach, Timothy Miller, Chen Lin, Steven Bethard, Guergana Savova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

82 Scopus citations

Abstract

We experiment with neural architectures for temporal relation extraction and establish a new state of the art for several scenarios. We find that neural models with only tokens as input outperform state-of-the-art hand-engineered feature-based models, that convolutional neural networks outperform LSTM models, and that encoding relation arguments with XML tags outperforms a traditional position-based encoding.
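The abstract's last finding can be sketched concretely. Below is a minimal illustration (not the authors' code; the tag names `<e1>`/`<e2>`, the example sentence, and both helper functions are assumptions) contrasting an XML-tag encoding of the two relation arguments with a traditional per-token position encoding:

```python
# Illustrative sketch of two ways to mark relation arguments in a
# token sequence. Tag names and spans are hypothetical examples.

def xml_tag_encoding(tokens, arg1_span, arg2_span):
    """Wrap the argument spans (start, end indices) in XML-style tags."""
    s1, e1 = arg1_span
    s2, e2 = arg2_span
    out = []
    for i, tok in enumerate(tokens):
        if i == s1:
            out.append("<e1>")
        if i == s2:
            out.append("<e2>")
        out.append(tok)
        if i == e1:
            out.append("</e1>")
        if i == e2:
            out.append("</e2>")
    return out

def position_encoding(tokens, arg1_span, arg2_span):
    """Attach signed distances to each argument (position-based features)."""
    def dist(i, span):
        s, e = span
        if s <= i <= e:
            return 0
        return i - e if i > e else i - s
    return [(tok, dist(i, arg1_span), dist(i, arg2_span))
            for i, tok in enumerate(tokens)]

tokens = "patient was admitted before the surgery".split()
print(xml_tag_encoding(tokens, (2, 2), (5, 5)))
# ['patient', 'was', '<e1>', 'admitted', '</e1>', 'before', 'the',
#  '<e2>', 'surgery', '</e2>']
print(position_encoding(tokens, (2, 2), (5, 5)))
```

In the XML-tag scheme the argument markers become ordinary tokens that the network's embedding layer can learn, whereas the position-based scheme supplies distances as separate per-token features.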

Original language: English (US)
Title of host publication: Short Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 746-751
Number of pages: 6
ISBN (Electronic): 9781510838604
State: Published - 2017
Event: 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Valencia, Spain
Duration: Apr 3 2017 - Apr 7 2017

Publication series

Name: 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference
Volume: 2

Conference

Conference: 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017
Country/Territory: Spain
City: Valencia
Period: 4/3/17 - 4/7/17

ASJC Scopus subject areas

  • Linguistics and Language
  • Language and Linguistics
