Automated analysis of interactional synchrony using robust facial tracking and expression recognition

Xiang Yu, Shaoting Zhang, Yang Yu, Norah Dunbar, Matthew Jensen, Judee K. Burgoon, Dimitris N. Metaxas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Scopus citations

Abstract

In this paper, we propose an automated, data-driven and unobtrusive framework to analyze interactional synchrony. We use this information to determine whether interpersonal synchrony can be an indicator of deceit. Our framework includes a robust facial tracking module, an effective expression recognition method, and synchrony feature extraction and feature selection methods. These synchrony features are used to learn classification models for deception recognition. To evaluate the proposed framework, we have conducted extensive experiments on a database of 242 video samples. We validate the performance of each technical module in our framework and show that the synchrony features are effective at detecting deception.
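As a rough illustration of the pipeline named in the abstract (expression signals → synchrony feature extraction → feature selection → classification), the sketch below computes windowed correlation features between two interlocutors' expression time series and feeds them to a selector and classifier. All concrete choices here (Pearson correlation over fixed windows, SelectKBest, an RBF SVM, the toy data and labels) are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch, assuming per-frame expression intensity scores are available
# for both interlocutors in each dyadic video. Feature, selector, and classifier
# choices are hypothetical, not the paper's actual method.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def synchrony_features(expr_a, expr_b, window=30, step=15):
    """One synchrony feature per window: absolute Pearson correlation between
    the two interlocutors' expression intensity series in that window."""
    feats = []
    for start in range(0, len(expr_a) - window + 1, step):
        a = expr_a[start:start + window]
        b = expr_b[start:start + window]
        if a.std() == 0 or b.std() == 0:
            feats.append(0.0)  # degenerate window: no variation to correlate
        else:
            feats.append(abs(np.corrcoef(a, b)[0, 1]))
    return np.array(feats)

# Toy stand-in for 242 videos of 300 frames each (random signals and labels).
rng = np.random.default_rng(0)
n_videos, n_frames = 242, 300
X = np.stack([
    synchrony_features(rng.standard_normal(n_frames), rng.standard_normal(n_frames))
    for _ in range(n_videos)
])
y = rng.integers(0, 2, n_videos)  # 1 = deceptive, 0 = truthful (toy labels)

# Feature selection followed by an SVM classifier, evaluated by cross-validation.
clf = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=10), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```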

Original language: English (US)
Title of host publication: 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition, FG 2013
DOIs
State: Published - 2013
Event: 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition, FG 2013 - Shanghai, China
Duration: Apr 22, 2013 - Apr 26, 2013

Publication series

Name: 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition, FG 2013

Other

Other: 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition, FG 2013
Country/Territory: China
City: Shanghai
Period: 4/22/13 - 4/26/13

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
