Coding Small Group Communication with AI: RNNs and Transformers with Context

Andrew Pilny, Joseph Bonito, Aaron Schecter

Research output: Contribution to journal › Article › peer-review

Abstract

This study compares the performance of recurrent neural networks (RNNs) and transformer-based models (DistilBERT) in classifying utterances as dialogue acts. The results show that transformers consistently outperform RNNs, highlighting their usefulness in coding small group interaction. The study also explores the impact of incorporating context in the form of preceding and following utterances. The findings reveal that adding context leads to modest improvements in model performance; however, in some cases it produces a slight decrease. The study discusses the implications of these findings for small group researchers employing AI models for text classification tasks.
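The abstract describes augmenting each utterance with its preceding and following utterances before classification. A minimal sketch of one way such context windows could be built for a transformer classifier is shown below; the function name, window sizes, and `[SEP]` joining convention are illustrative assumptions, not details taken from the article.

```python
# Hypothetical sketch of context augmentation: each utterance is joined
# with its neighboring turns so a model like DistilBERT can see
# surrounding context. All names and defaults here are assumptions.

def with_context(utterances, n_before=1, n_after=1, sep=" [SEP] "):
    """Return one input string per utterance, including up to
    n_before preceding and n_after following utterances."""
    examples = []
    for i, utt in enumerate(utterances):
        before = utterances[max(0, i - n_before):i]   # preceding turns
        after = utterances[i + 1:i + 1 + n_after]     # following turns
        examples.append(sep.join(before + [utt] + after))
    return examples

turns = ["Let's start.", "I agree.", "Any objections?"]
print(with_context(turns))
# → ["Let's start. [SEP] I agree.",
#    "Let's start. [SEP] I agree. [SEP] Any objections?",
#    "I agree. [SEP] Any objections?"]
```

Each augmented string could then be fed to a standard sequence-classification pipeline; at the window edges the function simply includes fewer context turns rather than padding.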

Original language: English (US)
Journal: Small Group Research
State: Accepted/In press - 2025

Keywords

  • communication
  • content analysis
  • interaction analysis
  • meetings

ASJC Scopus subject areas

  • Social Psychology
  • Applied Psychology

