TY - GEN
T1 - Recurrent neural network architectures for event extraction from Italian medical reports
AU - Viani, Natalia
AU - Miller, Timothy A.
AU - Dligach, Dmitriy
AU - Bethard, Steven
AU - Napolitano, Carlo
AU - Priori, Silvia G.
AU - Bellazzi, Riccardo
AU - Sacchi, Lucia
AU - Savova, Guergana K.
N1 - Publisher Copyright:
© Springer International Publishing AG 2017.
PY - 2017
Y1 - 2017
N2 - Medical reports include many occurrences of relevant events in the form of free text. To make data easily accessible and to improve medical decisions, clinical information extraction is crucial. Traditional extraction methods usually rely on the availability of external resources, or require complex annotated corpora and elaborately designed features. Especially for languages other than English, progress has been limited by the scarce availability of tools and resources. In this work, we explore recurrent neural network (RNN) architectures for clinical event extraction from Italian medical reports. The proposed model includes an embedding layer and an RNN layer. To find the best configuration for event extraction, we explored different RNN architectures, including Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU). We also tried feeding morpho-syntactic information into the network. The best result was obtained by using the GRU network with additional morpho-syntactic inputs.
AB - Medical reports include many occurrences of relevant events in the form of free text. To make data easily accessible and to improve medical decisions, clinical information extraction is crucial. Traditional extraction methods usually rely on the availability of external resources, or require complex annotated corpora and elaborately designed features. Especially for languages other than English, progress has been limited by the scarce availability of tools and resources. In this work, we explore recurrent neural network (RNN) architectures for clinical event extraction from Italian medical reports. The proposed model includes an embedding layer and an RNN layer. To find the best configuration for event extraction, we explored different RNN architectures, including Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU). We also tried feeding morpho-syntactic information into the network. The best result was obtained by using the GRU network with additional morpho-syntactic inputs.
KW - Information extraction
KW - Natural language processing
KW - Neural network models
UR - http://www.scopus.com/inward/record.url?scp=85021637074&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85021637074&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-59758-4_21
DO - 10.1007/978-3-319-59758-4_21
M3 - Conference contribution
AN - SCOPUS:85021637074
SN - 9783319597577
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 198
EP - 202
BT - Artificial Intelligence in Medicine - 16th Conference on Artificial Intelligence in Medicine, AIME 2017, Proceedings
A2 - ten Teije, Annette
A2 - Popow, Christian
A2 - Sacchi, Lucia
A2 - Holmes, John H.
PB - Springer-Verlag
T2 - 16th Conference on Artificial Intelligence in Medicine, AIME 2017
Y2 - 21 June 2017 through 24 June 2017
ER -