Publications

Tracking continuous emotional trends of participants during affective dyadic interactions using body language and speech information

Abstract

We address the problem of tracking continuous levels of a participant's activation, valence and dominance during the course of affective dyadic interactions, where participants may be speaking, listening or doing neither. To this end, we extract detailed and intuitive descriptions of each participant's body movements, posture and behavior toward their interlocutor, together with speech information. We apply a Gaussian Mixture Model-based approach that computes a mapping from a set of observed audio–visual cues to an underlying emotional state. We obtain promising results for tracking trends of participants' activation and dominance values, outperforming other regression-based approaches used in the literature. Additionally, we shed light on the way expressive body language is modulated by underlying emotional states in the context of dyadic interactions.
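The mapping described above can be illustrated with a standard Gaussian Mixture Model regression: fit a joint GMM over concatenated feature/label vectors, then predict the emotional value as the conditional expectation E[y | x]. The sketch below is a generic illustration of this family of methods, not the paper's exact implementation; the synthetic data, component count, and function names are assumptions for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-in data: x = audio-visual cue vector, y = one continuous
# emotion dimension (e.g. activation). Real features would come from
# body-language and speech descriptors.
n, dx = 500, 4
x = rng.normal(size=(n, dx))
y = (x @ rng.normal(size=dx))[:, None] + 0.1 * rng.normal(size=(n, 1))

# Fit a joint GMM over z = [x, y].
z = np.hstack([x, y])
gmm = GaussianMixture(n_components=3, covariance_type="full",
                      random_state=0).fit(z)

def gmm_regress(gmm, x_new, dx):
    """E[y | x] under the joint GMM: responsibility-weighted mixture of
    per-component conditional Gaussian means."""
    means, covs, w = gmm.means_, gmm.covariances_, gmm.weights_
    preds = []
    for xq in x_new:
        cond_means, log_resp = [], []
        for k in range(gmm.n_components):
            mu_x, mu_y = means[k, :dx], means[k, dx:]
            Sxx, Sxy = covs[k][:dx, :dx], covs[k][:dx, dx:]
            diff = xq - mu_x
            sol = np.linalg.solve(Sxx, diff)
            # Conditional mean of y given x under component k.
            cond_means.append(mu_y + Sxy.T @ sol)
            # Log marginal density of x under component k (for responsibilities).
            logdet = np.linalg.slogdet(Sxx)[1]
            log_resp.append(np.log(w[k])
                            - 0.5 * (diff @ sol + logdet + dx * np.log(2 * np.pi)))
        log_resp = np.array(log_resp)
        resp = np.exp(log_resp - log_resp.max())
        resp /= resp.sum()
        preds.append(sum(r * m for r, m in zip(resp, cond_means)))
    return np.array(preds)

pred = gmm_regress(gmm, x[:10], dx)
```

In practice such a mapping is trained on time-aligned feature/annotation pairs and applied frame by frame to produce a continuous emotion trajectory.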

Date: 2013
Authors: Angeliki Metallinou, Athanasios Katsamanis, Shrikanth Narayanan
Journal: Image and Vision Computing
Volume: 31
Issue: 2
Pages: 137–152
Publisher: Elsevier