TY - GEN
T1 - Exploring Text Representations for Generative Temporal Relation Extraction
AU - Dligach, Dmitriy
AU - Miller, Timothy
AU - Bethard, Steven
AU - Savova, Guergana
N1 - Funding Information:
Research reported in this publication was supported by the National Library of Medicine of the National Institutes of Health under Award Numbers R01LM012973 and R01LM010090. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Publisher Copyright:
© 2022 Association for Computational Linguistics.
PY - 2022
Y1 - 2022
N2 - Sequence-to-sequence models are appealing because they allow both the encoder and the decoder to be shared across many tasks by formulating those tasks as text-to-text problems. Despite recently reported successes of such models, we find that engineering input/output representations for these text-to-text models is challenging. On the Clinical TempEval 2016 relation extraction task, the most natural choice of output representation, in which relations are spelled out as simple predicate logic statements, did not lead to good performance. We explore a variety of input/output representations; the most successful prompts for one event at a time and achieves results competitive with standard pairwise temporal relation extraction systems.
AB - Sequence-to-sequence models are appealing because they allow both the encoder and the decoder to be shared across many tasks by formulating those tasks as text-to-text problems. Despite recently reported successes of such models, we find that engineering input/output representations for these text-to-text models is challenging. On the Clinical TempEval 2016 relation extraction task, the most natural choice of output representation, in which relations are spelled out as simple predicate logic statements, did not lead to good performance. We explore a variety of input/output representations; the most successful prompts for one event at a time and achieves results competitive with standard pairwise temporal relation extraction systems.
UR - http://www.scopus.com/inward/record.url?scp=85138302057&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85138302057&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85138302057
T3 - ClinicalNLP 2022 - 4th Workshop on Clinical Natural Language Processing, Proceedings
SP - 109
EP - 113
BT - ClinicalNLP 2022 - 4th Workshop on Clinical Natural Language Processing, Proceedings
A2 - Naumann, Tristan
A2 - Bethard, Steven
A2 - Roberts, Kirk
A2 - Rumshisky, Anna
PB - Association for Computational Linguistics (ACL)
T2 - 4th Workshop on Clinical Natural Language Processing, ClinicalNLP 2022
Y2 - 14 July 2022
ER -