Full Release of the IEMOCAP database
We are releasing the full IEMOCAP database for academic purposes. The database contains facial motion capture data, speech, and video from multiple participants during improvised affective dyadic interactions. The database is available free of charge; you only need to sign a release form. See here for further details.
Emotion Tracking Code Release
You can download the tar.gz file containing the code here [CODE UPDATED!]
This package contains code for estimating the continuous emotional state of a subject through time based on his or her body language. More generally, it can be used to estimate a continuous random variable through time from a set of other continuous random variables. For details on the features, the tracking method, and the experiments, please see the corresponding publication:
Angeliki Metallinou, Athanasios Katsamanis and Shrikanth Narayanan, "Tracking continuous emotional trends of participants during affective dyadic interactions using body language and speech information," Image and Vision Computing, Special Issue on Continuous Affect Analysis, accepted for publication, 2012.
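To illustrate the general estimation task described above (tracking a continuous target variable through time from a set of other continuous variables), here is a minimal sketch. It is not the method released in the package: it uses a simple least-squares mapping from synthetic "feature" trajectories to a synthetic "activation" trend, followed by exponential smoothing of the per-frame estimates. All variable names and the data-generation setup are hypothetical.

```python
import numpy as np

# --- Hypothetical illustration, not the released code's actual method ---
# We estimate a continuous target (e.g. an activation value) at each time
# frame from several continuous features, then smooth the per-frame
# estimates to obtain a continuous trend over time.

rng = np.random.default_rng(0)

T = 200                                   # number of time frames
t = np.linspace(0.0, 10.0, T)             # time axis

# Synthetic, slowly varying "body language" features (one column each)
X = np.stack([np.sin(t), np.cos(0.5 * t), np.sin(1.3 * t + 1.0), t / 10.0],
             axis=1)

# Synthetic target: a linear combination of the features plus frame noise
w_true = np.array([0.8, -0.5, 0.3, 0.1])
y = X @ w_true + 0.1 * rng.normal(size=T)

# Fit a linear mapping on an initial training portion of the frames
n_train = 150
w, *_ = np.linalg.lstsq(X[:n_train], y[:n_train], rcond=None)

# Per-frame estimates on the held-out frames
y_hat = X[n_train:] @ w

def smooth(values, alpha=0.3):
    """Exponential moving average: turns noisy per-frame estimates
    into a continuous trend through time."""
    out = np.empty_like(values)
    out[0] = values[0]
    for i in range(1, len(values)):
        out[i] = alpha * values[i] + (1.0 - alpha) * out[i - 1]
    return out

trend = smooth(y_hat)
corr = np.corrcoef(trend, y[n_train:])[0, 1]
```

The smoothing step reflects the tracking intuition: emotional trends vary slowly relative to the frame rate, so averaging out frame-level noise improves the estimated trajectory.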
In this package, we are releasing a small subset of the data used in the above publication: extracted body language features and activation values for 2 subjects across 6 recordings. This limited release is meant to demonstrate the use of the code; it does not replicate the setup or results of the experiments in the paper, as that would require a full release of the data. We are preparing a full data release, so if you are interested, please check this website for updates.