Current Research Threads and Projects

  • Context-Sensitive Emotion Recognition using Facial and Speech Information

    Emotions do not happen in isolation but in context: a conversation topic, a setting, an emotional history, and so on. For example, it seems reasonable to assume that an angry utterance is more likely to be followed by another angry utterance than by a happy one. My research examines how this kind of temporal context can be exploited to improve emotion recognition performance through multimodal classification schemes that model past, and possibly future, emotional evolution (a toy illustration of temporal-context decoding appears at the end of this item). Beyond temporal context, one can think of other, higher-level types of context, such as the conversation topic or general situational understanding. Finding ways to incorporate such context into emotion recognition systems is an exciting current research direction.

    For more details, see our recent publication:

    • Angeliki Metallinou, Martin Woellmer, Athanasios Katsamanis, Florian Eyben, Bjoern Schuller and Shrikanth Narayanan, Context-Sensitive Learning for Enhanced Audiovisual Emotion Classification, IEEE Transactions on Affective Computing (TAC), Vol. 3, No. 2, pp. 184-198, April-June 2012

    For details about the research threads of the USC Emotion Group, please visit our website. You may also be interested in the IEMOCAP database, a recently collected multimodal emotional corpus that we are currently releasing.
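
    The following toy sketch illustrates the basic intuition behind temporal context; it is not the context-sensitive approach of the paper above. Per-utterance emotion posteriors from any context-free classifier are re-decoded with Viterbi using a transition matrix that favors emotional continuity. All class names, transition probabilities and posteriors below are made up for illustration.

      import numpy as np

      # Illustrative emotion classes and a transition matrix that favors
      # emotional continuity (anger tends to be followed by anger).
      EMOTIONS = ["angry", "happy", "neutral", "sad"]
      trans = np.array([[0.70, 0.05, 0.20, 0.05],   # from angry
                        [0.05, 0.70, 0.20, 0.05],   # from happy
                        [0.15, 0.15, 0.60, 0.10],   # from neutral
                        [0.05, 0.05, 0.20, 0.70]])  # from sad

      def decode_with_context(posteriors, trans):
          """Viterbi decoding: re-rank per-utterance emotion posteriors
          (rows = utterances, columns = classes) using temporal context."""
          logp = np.log(posteriors + 1e-12)
          logt = np.log(trans + 1e-12)
          T, K = logp.shape
          score = np.zeros((T, K))
          back = np.zeros((T, K), dtype=int)
          score[0] = logp[0]
          for t in range(1, T):
              cand = score[t - 1][:, None] + logt    # K x K: previous state -> current state
              back[t] = cand.argmax(axis=0)
              score[t] = cand.max(axis=0) + logp[t]
          path = [int(score[-1].argmax())]
          for t in range(T - 1, 0, -1):
              path.append(int(back[t, path[-1]]))
          return [EMOTIONS[k] for k in reversed(path)]

      # Hypothetical per-utterance posteriors from a context-free classifier:
      # the third utterance looks weakly "happy" in isolation, but the
      # surrounding angry utterances pull its decoded label back to "angry".
      post = np.array([[0.80, 0.05, 0.10, 0.05],
                       [0.60, 0.15, 0.20, 0.05],
                       [0.35, 0.40, 0.20, 0.05],
                       [0.70, 0.05, 0.20, 0.05]])
      print(decode_with_context(post, trans))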

  • Continuously Tracking Emotions During Affective Dyadic Interactions Based on Body Language and Speech Information (CreativeIT project)

    Emotional expressions are not static; they tend to evolve during the course of an interaction with variable intensity and clarity. For example, covert anger might escalate to fury or subside to neutrality depending on how a situation or conversation evolves. This motivates us to think of emotions as continuous variables evolving through time, and to reformulate the problem of emotion recognition as emotion tracking. My research includes tracking trends of emotional attributes of participants, such as intensity or degree of negativity, using audiovisual observations from their interaction. This can highlight regions of interest, e.g., where abrupt changes in emotional intensity occur, which may indicate interesting or emotionally salient events (a small illustrative sketch appears at the end of this item).

    The study of affective full-body language in the context of dyadic interaction is an additional interesting aspect of this work. In contrast to speech and facial expressions, affective body language has received comparatively little attention in the affective computing literature. In our work, we examine how body gestures, posture and behavior towards an interlocutor are modulated to reflect an underlying emotional state.

    For more details, see our recent publication:

    • Angeliki Metallinou, Athanasios Katsamanis and Shrikanth Narayanan, Tracking Continuous Emotional Trends of Participants during Affective Dyadic Interactions using Body Language and Speech Information, Image and Vision Computing, Special Issue on Affect Analysis in Continuous Input, Vol. 31, Issue 2, pp. 137-152, Feb. 2013

    We are also releasing part of our code here.
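
    As a loose illustration of the continuous-tracking formulation (not the audiovisual mapping used in the article above), the sketch below smooths noisy frame-level estimates of a continuous emotional attribute, e.g. intensity, with a simple random-walk Kalman filter and flags frames where the tracked value changes abruptly. The signal, noise level and threshold are all synthetic and purely illustrative.

      import numpy as np

      def track_attribute(noisy_estimates, process_var=1e-3, obs_var=5e-2):
          """Random-walk Kalman filter over noisy frame-level estimates of a
          continuous emotional attribute (e.g. intensity)."""
          x, p = noisy_estimates[0], 1.0           # state estimate and its variance
          smoothed = []
          for z in noisy_estimates:
              p = p + process_var                  # predict (random-walk model)
              k = p / (p + obs_var)                # Kalman gain
              x = x + k * (z - x)                  # correct with observation z
              p = (1.0 - k) * p
              smoothed.append(x)
          return np.array(smoothed)

      def salient_frames(smoothed, threshold=0.06):
          """Frames where the tracked attribute changes abruptly; such regions
          may correspond to emotionally salient events."""
          return np.where(np.abs(np.diff(smoothed)) > threshold)[0]

      # Synthetic example: intensity rises sharply mid-interaction and the
      # per-frame observations are noisy.
      rng = np.random.default_rng(0)
      truth = np.concatenate([np.full(100, -0.5), np.full(100, 0.8)])
      observed = truth + 0.15 * rng.standard_normal(truth.size)
      tracked = track_attribute(observed)
      print("abrupt-change frames:", salient_frames(tracked))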

  • CreativeIT project

    An important part of this work is the USC CreativeIT database, which contains a great variety of theatrical improvisations, collected in a collaborative effort between SAIL lab and the USC Theater School. The USC CreativeIT database is a rich corpus of expressive dyadic interactions that can be used for various research directions, including the analysis of actors' improvisation and creativity and the animation of affect-sensitive virtual agents.

    Here you can find a related article and video, created by the BBC, describing the innovative CreativeIT project.

    You can also click here for a related article from the USC Viterbi School of Engineering.

  • Applications on Human Behavior Analysis (Autism)

    The Behavioral Signal Processing subgroup at SAIL lab aims to apply computational methodologies to data that were once analyzed only by observational researchers, such as recordings of children with Autism Spectrum Disorders (ASD). This is part of our effort to describe quantitatively certain qualitative psychological observations, in order to bring new insights into 'atypical' human behavior, behavior under emotional stress, human interactions and problem solving. Specifically, our current work focuses on quantifying atypicality/awkwardness in the affective facial expressions of children with ASD (a rough illustrative sketch appears at the end of this item).

    For more details, see our recently accepted publication:

    • Angeliki Metallinou, Ruth B. Grossman and Shrikanth Narayanan, Quantifying Atypicality in Affective Facial Expressions of Children with Autism Spectrum Disorders, IEEE International Conference on Multimedia & Expo (ICME), San Jose, CA, 2013
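
    As a rough illustration of the general idea (the ICME paper above uses its own features and models; everything below is a made-up stand-in), one way to quantify atypicality is to score how far an expression's feature vector falls from a reference distribution estimated on a comparison group, for example with a Mahalanobis distance under a Gaussian model.

      import numpy as np

      def fit_reference(features):
          """Fit a Gaussian reference model (mean, regularized covariance) to
          facial-expression feature vectors from a reference population."""
          mu = features.mean(axis=0)
          cov = np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])
          return mu, np.linalg.inv(cov)

      def atypicality(x, mu, cov_inv):
          """Mahalanobis distance of one expression's feature vector from the
          reference model; larger values indicate greater atypicality."""
          d = x - mu
          return float(np.sqrt(d @ cov_inv @ d))

      # Hypothetical per-expression features (e.g. facial landmark statistics).
      rng = np.random.default_rng(1)
      reference = rng.normal(0.0, 1.0, size=(200, 10))     # "typical" expressions
      mu, cov_inv = fit_reference(reference)
      typical_sample = rng.normal(0.0, 1.0, size=10)
      atypical_sample = rng.normal(2.5, 1.0, size=10)       # shifted distribution
      print(atypicality(typical_sample, mu, cov_inv),
            atypicality(atypical_sample, mu, cov_inv))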