University of Southern California
  The Interactive Emotional Dyadic Motion Capture (IEMOCAP) Database

IEMOCAP Database

The Interactive Emotional Dyadic Motion Capture (IEMOCAP) database is an acted, multimodal, and multispeaker database collected at the SAIL lab at USC. It contains approximately 12 hours of audiovisual data, including video, speech, facial motion capture, and text transcriptions. The corpus consists of dyadic sessions in which actors perform improvisations or scripted scenarios specifically selected to elicit emotional expression. The IEMOCAP database is annotated by multiple annotators with categorical labels, such as anger, happiness, sadness, and neutrality, as well as dimensional labels for valence, activation, and dominance. The detailed motion-capture information, the interactive setting used to elicit authentic emotions, and the size of the database make this corpus a valuable addition to the existing databases in the community for the study and modeling of multimodal, expressive human communication.
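Since each utterance carries categorical labels from multiple annotators, users of the corpus often derive a single label per utterance by majority voting. A minimal sketch of that step is below; the function name and the tie-handling policy (returning `None` when no strict majority exists) are illustrative choices, not part of the database's own tooling:

```python
from collections import Counter

def majority_label(annotations):
    """Return the label chosen by a strict majority of annotators, or None."""
    counts = Counter(annotations)
    label, n = counts.most_common(1)[0]
    # Require more than half of the annotators to agree
    return label if n > len(annotations) / 2 else None

print(majority_label(["anger", "anger", "sadness"]))  # anger
print(majority_label(["happiness", "sadness"]))       # None (no majority)
```

Utterances without a majority label are commonly excluded from classification experiments, which is one reason reported subsets of IEMOCAP differ in size across studies.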

 


(c) 2004 Speech Analysis & Interpretation Laboratory

3710 S. McClintock Ave, RTH 320
Los Angeles, CA 90089, U.S.A