TY - GEN
T1 - NEUROSTRUCTURAL DECODING
T2 - 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
AU - Bastan, Mohaddeseh
AU - Surdeanu, Mihai
AU - Balasubramanian, Niranjan
N1 - Publisher Copyright:
© 2023 Association for Computational Linguistics.
PY - 2023
Y1 - 2023
N2 - Text generation often involves producing texts that also satisfy a given set of semantic constraints. While most approaches for conditional text generation have primarily focused on lexical constraints, they often struggle to effectively incorporate syntactic constraints, which provide a richer language for approximating semantic constraints. We address this gap by introducing NEUROSTRUCTURAL DECODING, a new decoding algorithm that incorporates syntactic constraints to further improve the quality of the generated text. We build NEUROSTRUCTURAL DECODING on the NeuroLogic Decoding (Lu et al., 2021b) algorithm, which enables language generation models to produce fluent text while satisfying complex lexical constraints. Our algorithm is powerful and scalable. It tracks lexico-syntactic constraints (e.g., we need to observe dog as subject and ball as object) during decoding by parsing the partial generations at each step. To this end, we adapt a dependency parser to generate parses for incomplete sentences. Our approach is evaluated on three different language generation tasks, and the results show improved performance in both lexical and syntactic metrics compared to previous methods. The results suggest this is a promising solution for integrating fine-grained controllable text generation into conventional beam search decoding.
AB - Text generation often involves producing texts that also satisfy a given set of semantic constraints. While most approaches for conditional text generation have primarily focused on lexical constraints, they often struggle to effectively incorporate syntactic constraints, which provide a richer language for approximating semantic constraints. We address this gap by introducing NEUROSTRUCTURAL DECODING, a new decoding algorithm that incorporates syntactic constraints to further improve the quality of the generated text. We build NEUROSTRUCTURAL DECODING on the NeuroLogic Decoding (Lu et al., 2021b) algorithm, which enables language generation models to produce fluent text while satisfying complex lexical constraints. Our algorithm is powerful and scalable. It tracks lexico-syntactic constraints (e.g., we need to observe dog as subject and ball as object) during decoding by parsing the partial generations at each step. To this end, we adapt a dependency parser to generate parses for incomplete sentences. Our approach is evaluated on three different language generation tasks, and the results show improved performance in both lexical and syntactic metrics compared to previous methods. The results suggest this is a promising solution for integrating fine-grained controllable text generation into conventional beam search decoding.
UR - http://www.scopus.com/inward/record.url?scp=85174389640&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85174389640&partnerID=8YFLogxK
U2 - 10.18653/v1/2023.acl-long.528
DO - 10.18653/v1/2023.acl-long.528
M3 - Conference contribution
AN - SCOPUS:85174389640
T3 - Proceedings of the Annual Meeting of the Association for Computational Linguistics
SP - 9496
EP - 9510
BT - Long Papers
PB - Association for Computational Linguistics (ACL)
Y2 - 9 July 2023 through 14 July 2023
ER -