Exploring transformers and time lag features for predicting changes in mood over time

John Culnan, Damian Y. Romero Diaz, Steven Bethard

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper presents transformer-based models created for the CLPsych 2022 shared task. Using posts from Reddit users over a period of time, we aim to predict changes in mood from post to post. We test models that preserve timeline information through explicit ordering of posts, as well as models that do not order posts but preserve features encoding the length of time between a user's posts. We find that a model with temporal information may provide slight benefits over the same model without such information, although a RoBERTa transformer model provides enough information to make similar predictions without custom-encoded time information.
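The abstract mentions features on the length of time between a user's posts. The paper's exact feature encoding is not described here, but a minimal sketch of computing such time-lag features from timestamped posts (function name and day-based units are illustrative assumptions, not the authors' implementation) might look like:

```python
from datetime import datetime

def time_lag_features(timestamps):
    """Hypothetical sketch: gap in days between consecutive posts for one user.

    `timestamps` is a list of ISO-8601 strings; the first post gets a lag of 0.
    """
    ts = sorted(datetime.fromisoformat(t) for t in timestamps)
    # Difference between each post and the previous one, converted to days.
    return [0.0] + [(b - a).total_seconds() / 86400.0 for a, b in zip(ts, ts[1:])]

# Example: posts two days apart, then twelve hours apart.
lags = time_lag_features(
    ["2022-01-01T10:00:00", "2022-01-03T10:00:00", "2022-01-03T22:00:00"]
)
# lags == [0.0, 2.0, 0.5]
```

Such per-post lag values could then be appended to a transformer's input features, which is one way to supply temporal information without explicitly ordering the posts.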

Original language: English (US)
Title of host publication: CLPsych 2022 - 8th Workshop on Computational Linguistics and Clinical Psychology, Proceedings
Editors: Ayah Zirikly, Dana Atzil-Slonim, Maria Liakata, Steven Bedrick, Bart Desmet, Molly Ireland, Andrew Lee, Sean MacAvaney, Matthew Purver, Rebecca Resnik, Andrew Yates
Publisher: Association for Computational Linguistics (ACL)
Pages: 226-231
Number of pages: 6
ISBN (Electronic): 9781955917872
DOIs
State: Published - 2022
Event: 8th Workshop on Computational Linguistics and Clinical Psychology, CLPsych 2022 - Seattle, United States
Duration: Jul 15 2022 → …

Publication series

Name: CLPsych 2022 - 8th Workshop on Computational Linguistics and Clinical Psychology, Proceedings

Conference

Conference: 8th Workshop on Computational Linguistics and Clinical Psychology, CLPsych 2022
Country/Territory: United States
City: Seattle
Period: 7/15/22 → …

ASJC Scopus subject areas

  • Language and Linguistics
  • Computer Networks and Communications
  • Speech and Hearing

