IEMOCAP Database

The Interactive Emotional Dyadic Motion Capture (IEMOCAP) database is an acted, multimodal, and multispeaker database recently collected at the SAIL lab at USC. It contains approximately 12 hours of audiovisual data, including video, speech, facial motion capture, and text transcriptions. (Read more...)

EMA Database

The Electromagnetic Articulography (EMA) database contains a total of 680 utterances spoken in four different target emotions: anger, happiness, sadness, and neutrality. (Read more...)

MRI-TIMIT Database

MRI-TIMIT is a large-scale database of synchronized audio and real-time magnetic resonance imaging (rtMRI) data for speech research. The database currently consists of midsagittal upper airway MRI data and phonetically transcribed companion audio, acquired from two male and two female speakers of American English. (Read more...)

USC-TIMIT Database

USC-TIMIT is a database of speech production data under ongoing development. It currently includes real-time magnetic resonance imaging data from five male and five female speakers of American English, as well as electromagnetic articulography data from three of these speakers. (Read more...)

CreativeIT Database

The CreativeIT database is an acted, multimodal database of dyadic theatrical improvisations. It contains 8 sessions of audiovisual data, including video, speech, and full-body motion capture data. (Read more...)