University of Southern California


Dynamics of Consonant Production

We are using Real-time MRI to investigate how consonants are produced in different languages. Real-time MRI reveals how the tongue, lips, velum, and glottis move, and how these organs are coordinated when a speaker articulates a consonant in different vowel contexts. The demonstration videos provide mid-sagittal views of two speakers producing all of the consonants in the IPA chart.

Phonetics of Singing

We are using Real-time MRI to study different types of singing, including Western Classical Soprano and Human Beatboxing performance, to investigate:

  • how human vocal organs are utilized in different performance styles.
  • how this articulation resembles/differs from that of spoken speech.
  • how percussive and linguistic gestures are coordinated.
  • how we perceive different signals as musical or linguistic in nature.

More information about this work can be found in the paper:

Michael Proctor, Erik Bresch, Dani Byrd, Krishna Nayak, and Shrikanth Narayanan. Paralinguistic Mechanisms of Production in Human 'Beatboxing': A Real-time Magnetic Resonance Imaging Study. J. Acoust. Soc. Am. 133(2), 2013, and the accompanying multimedia site.

The video The Diva and the MC -- singing research using Real-time MRI shows some examples of the data.

Speech Errors

Speech errors ('slips of the tongue') offer important insights into speech production and phonological organization. Speech errors are not random -- they typically resemble other speech events, and they appear to be systematic in their occurrence and distribution.

Traditionally, speech error studies have relied on transcriptions, which are limited in several respects: they are subject to the perceptual biases of the experimenter, cannot detect gradient errors, and are restricted to phenomena with clear acoustic correlates.

We are using Real-time MRI to investigate the articulatory properties of speech errors. Anomalous motor activity, which is not always audible, can be detected in these data -- for example, articulatory intrusions and shifts in the coordination of different articulators -- which suggests that certain types of speech error may be better understood as gestural in nature.