We are pleased to release an Electromagnetic Articulography (EMA) database to the community for the study of expressive speech. The database contains a total of 680 utterances spoken in four target emotions: anger, happiness, sadness, and neutrality. It was designed for the joint study of both the acoustic and articulatory domains of emotional speech, but it can also be used in other research areas related to emotional speech, speech production, and so on. The data were collected at the SAIL laboratory at the University of Southern California in 2005.
The database includes the speech audio files, listener evaluations of their emotional content, and the corresponding articulatory motion capture files (time-aligned with the speech waveforms). The articulatory measurements comprise position, velocity, and acceleration values for the tongue tip, lower lip, and jaw. Three native speakers of American English were recorded: two females and one male. The two female talkers each produced 10 sentences, and the male produced 14 sentences (10 of which overlap with those of the female speakers). Each sentence was repeated 5 times in each of the four emotions. Multiple listeners evaluated the database for (1) categorical emotions (anger, happiness, sadness, and neutrality), with a confidence rating on a numeric scale for each emotion category, and (2) perceptual dimensions (valence, activation, and dominance).
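The utterance total stated above follows directly from the recording protocol; a minimal sketch of the arithmetic (speaker names are illustrative placeholders, not identifiers used in the database):

```python
# Corpus size implied by the recording protocol:
# (10 + 10 + 14 sentences) x 4 emotions x 5 repetitions = 680 utterances.
sentences_per_speaker = {"female_1": 10, "female_2": 10, "male_1": 14}
emotions = ["anger", "happiness", "sadness", "neutrality"]
repetitions = 5

total = sum(n * len(emotions) * repetitions
            for n in sentences_per_speaker.values())
print(total)  # 680
```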
2010 Signal Analysis and Interpretation Laboratory
3710 S. McClintock Ave, RTH 320
Los Angeles, CA 90089, U.S.A