Auditory speech perception
We are interested in how the motor and auditory cortices interact during speech perception. Transcranial magnetic stimulation (TMS) provides a powerful tool to investigate these interactions (see our recent review paper: Möttönen & Watkins, 2012). In many of our studies we apply single-pulse TMS over the lip area of the left primary motor cortex to elicit motor evoked potentials (MEPs) in the lip muscle. These lip MEPs are larger during visual and auditory speech perception than at rest (Watkins et al., 2003), showing that excitability of the articulatory motor cortex is enhanced during speech perception. We are currently investigating motor excitability during listening to speech in noise and during audiovisual speech perception.
Using low-frequency repetitive TMS (rTMS), we can temporarily disrupt the lip area of the motor cortex. We have shown that this disruption impairs listeners’ performance in categorical speech perception tasks involving lip-articulated speech sounds (e.g. /ba/ and /pa/) (Möttönen & Watkins, 2009).
To better understand how the motor cortex contributes to speech perception, we are using TMS in combination with electroencephalography (EEG), magnetoencephalography (MEG) and functional MRI. Our recent study showed that TMS-induced disruption of the articulatory motor cortex suppresses automatic EEG responses to changes in speech sounds, but not to changes in piano tones (Möttönen et al., 2013). This finding provides evidence that the auditory and motor cortices interact during speech processing.
Visual speech perception
During face-to-face communication a speaker’s articulatory movements are visible. It is possible to understand speech to some extent by “lipreading”. We are investigating the brain mechanisms of “lipreading” using brain stimulation and behavioural tests.
Using TMS, we found that excitability of the articulatory motor cortex was higher during observation of a known language (English) than an unknown language (Hebrew) or non-speech mouth movements, in both native and non-native speakers of English (Swaminathan et al., 2013). Both native and non-native English speakers were able to discriminate the known language from the unknown language on the basis of visual information alone. These data provide further evidence for sensorimotor processing of visual signals that are used in speech communication. The experiment was carried out in collaboration with Dr. Mairead MacSweeney from University College London.
Möttönen R & Watkins KE (2009). Motor representations of articulators contribute to categorical perception of speech sounds. Journal of Neuroscience, 29(31), 9818-9825.
Möttönen R, Dutton R & Watkins KE (2013). Auditory-motor processing of speech sounds. Cerebral Cortex. doi:10.1093/cercor/bhs110
Möttönen R & Watkins KE (2012). Using TMS to study the role of the articulatory motor system in speech perception. Aphasiology, 26(9), 1103-1118.
Swaminathan S, MacSweeney M, Boyles R, Waters D, Watkins KE & Möttönen R (2013). Motor excitability during visual perception of known and unknown spoken languages. Brain and Language.
Watkins KE, Strafella A & Paus T (2003). Seeing and hearing speech excites the motor system involved in speech production. Neuropsychologia, 41(8), 989-994.
Watkins KE & Paus T (2004). Modulation of motor excitability during speech perception: the role of Broca's area. Journal of Cognitive Neuroscience, 16(6), 978-987.