TY - GEN
T1 - A BERT-based one-pass multi-task model for clinical temporal relation extraction
AU - Lin, Chen
AU - Miller, Timothy
AU - Dligach, Dmitriy
AU - Sadeque, Farig
AU - Bethard, Steven
AU - Savova, Guergana
N1 - Funding Information:
The study was funded by R01LM10090, R01GM114355 and UG3CA243120 from the United States National Institutes of Health. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. We thank the anonymous reviewers for their valuable suggestions and criticism. The Titan Xp GPU used for this research was donated by the NVIDIA Corporation.
Publisher Copyright:
© Association for Computational Linguistics.
PY - 2020
Y1 - 2020
N2 - Recently, BERT has achieved state-of-the-art performance in temporal relation extraction from clinical Electronic Medical Records text. However, the current approach is inefficient as it requires multiple passes through each input sequence. We extend a recently-proposed one-pass model for relation classification to a one-pass model for relation extraction. We augment this framework by introducing global embeddings to help with long-distance relation inference, and by multi-task learning to increase model performance and generalizability.
AB - Recently, BERT has achieved state-of-the-art performance in temporal relation extraction from clinical Electronic Medical Records text. However, the current approach is inefficient as it requires multiple passes through each input sequence. We extend a recently-proposed one-pass model for relation classification to a one-pass model for relation extraction. We augment this framework by introducing global embeddings to help with long-distance relation inference, and by multi-task learning to increase model performance and generalizability.
UR - http://www.scopus.com/inward/record.url?scp=85100479095&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85100479095&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85100479095
T3 - Proceedings of the Annual Meeting of the Association for Computational Linguistics
SP - 70
EP - 75
BT - BioNLP 2020 - 19th SIGBioMed Workshop on Biomedical Language Processing, Proceedings of the Workshop
PB - Association for Computational Linguistics (ACL)
T2 - 19th SIGBioMed Workshop on Biomedical Language Processing, BioNLP 2020 at the 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020
Y2 - 9 July 2020
ER -