Descending-path convolution kernel for syntactic structures

Chen Lin, Timothy Miller, Alvin Kho, Steven Bethard, Dmitriy Dligach, Sameer Pradhan, Guergana Savova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

Convolution tree kernels are an efficient and effective method for comparing syntactic structures in NLP. However, current kernels such as the subset tree kernel and the partial tree kernel understate the similarity of very similar tree structures. Although soft-matching approaches can improve the similarity scores, they are corpus-dependent and their match relaxations may be task-specific. We propose an alternative approach, the descending path kernel, which gives intuitive similarity scores on comparable structures. This method is evaluated on two temporal relation extraction tasks and demonstrates its advantage over rich syntactic representations.
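
For illustration, below is a minimal sketch of a descending-path style tree kernel in Python. It assumes, as a simplification rather than the authors' exact formulation, that each tree is represented by the bag of downward label paths starting at every node, capped at a maximum length, and that similarity is the dot product of the two trees' path counts. The tuple-based tree encoding, the max_len parameter, and all function names are hypothetical.

from collections import Counter

def paths_from(node, max_len):
    # All downward label paths (as tuples of labels) that start at `node`,
    # capped at max_len labels per path.
    label, children = node
    result = [(label,)]
    if max_len > 1:
        for child in children:
            for path in paths_from(child, max_len - 1):
                result.append((label,) + path)
    return result

def descending_path_features(tree, max_len=3):
    # Bag of descending paths collected from every node of the tree.
    _, children = tree
    features = Counter(paths_from(tree, max_len))
    for child in children:
        features += descending_path_features(child, max_len)
    return features

def descending_path_kernel(t1, t2, max_len=3):
    # Similarity as the dot product of the two trees' path-count vectors.
    f1 = descending_path_features(t1, max_len)
    f2 = descending_path_features(t2, max_len)
    return sum(count * f2[path] for path, count in f1.items())

# Toy example: two near-identical parses, encoded as (label, [children]).
t_a = ("S", [("NP", [("DT", []), ("NN", [])]),
             ("VP", [("VBD", []), ("NP", [("NN", [])])])])
t_b = ("S", [("NP", [("DT", []), ("NN", [])]),
             ("VP", [("VBD", []), ("NP", [("NNS", [])])])])
print(descending_path_kernel(t_a, t_b))  # high overlap despite one differing leaf

Under this reading, two trees that differ in a single leaf still share most of their descending paths, so their score stays close to the self-similarity score, which matches the intuition described in the abstract.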

Original language: English (US)
Title of host publication: Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 81-86
Number of pages: 6
ISBN (Print): 9781937284732
DOIs
State: Published - 2014
Externally published: Yes
Event: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Baltimore, MD, United States
Duration: Jun 22, 2014 - Jun 27, 2014

Publication series

Name: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Proceedings of the Conference
Volume: 2

Other

Other: 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014
Country/Territory: United States
City: Baltimore, MD
Period: 6/22/14 - 6/27/14

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language
