We study the expression of emotions in behavioral (e.g., voice, speech, language, face, body gestures) and physiological signals, and their perception and processing by humans and machines.
We create affect-aware technologies.

  • We study multimodal and unimodal emotion expression patterns

  • We study and model dyadic interaction using body gestures and speech

  • We study the expression, perception, and acquisition of emotional language

  • We study speech production during emotional speech

  • We won five Interspeech Paralinguistics Challenges, in 2009, 2011, 2012, 2013, and 2014!

IEMOCAP database

Interactive emotional dyadic motion capture database: two people express assigned emotions to each other in dyadic sessions.
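
As a rough illustration of how one might organize such dyadic data in code, here is a minimal sketch; the field names, example values, and paths are hypothetical placeholders, not the database's actual schema.

```python
# Minimal, hypothetical representation of one dyadic turn; field names and
# example values are placeholders, not the database's actual file layout.
from dataclasses import dataclass

@dataclass
class DyadicTurn:
    session: str      # recording session identifier
    speaker: str      # which of the two interlocutors is speaking
    emotion: str      # assigned/annotated emotion category for the turn
    audio_path: str   # path to the speech segment
    mocap_path: str   # path to the motion-capture data

turn = DyadicTurn(
    session="session01",
    speaker="A",
    emotion="happy",
    audio_path="path/to/turn.wav",
    mocap_path="path/to/turn_mocap.txt",
)
print(turn.speaker, turn.emotion)
```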

Arousal rating

Estimating arousal ratings from body gestures and speech signals in dyadic interaction.
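
A minimal sketch of this kind of task, assuming pre-extracted per-turn feature vectors for each modality (the arrays, feature dimensions, and ridge-regression model below are illustrative placeholders, not our actual pipeline): concatenate speech and gesture features and regress the annotated arousal rating.

```python
# Sketch: feature-level fusion of speech and body-gesture features,
# followed by ridge regression of a continuous arousal rating.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_turns = 200                                     # number of dialogue turns (placeholder)
speech_feats = rng.normal(size=(n_turns, 32))     # e.g., pitch/energy statistics (placeholder)
gesture_feats = rng.normal(size=(n_turns, 24))    # e.g., body-motion statistics (placeholder)
arousal = rng.uniform(1, 5, size=n_turns)         # annotated arousal ratings (placeholder)

# Early fusion: concatenate the two modalities per turn
X = np.hstack([speech_feats, gesture_feats])

model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, arousal, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```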

Real-time MRI

The mid-sagittal plane of the vocal tract is recorded while actors utter emotional speech.
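
A minimal illustrative sketch, assuming the rtMRI frames have already been loaded into a (time, height, width) array (the array below is a random placeholder): frame-to-frame intensity change serves as a crude proxy for articulator movement over the course of an utterance.

```python
# Sketch: crude articulator-movement measure from a mid-sagittal rtMRI sequence.
import numpy as np

frames = np.random.rand(300, 68, 68)   # placeholder: 300 frames, 68x68 pixels each

# Mean absolute pixel difference between consecutive frames
movement = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
print("frames:", frames.shape[0], "peak-movement frame:", int(movement.argmax()))
```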