The everyday act of speaking involves the complex processes of speech motor control. One important feature of such control is the regulation of articulation when the auditory concomitants of speech do not correspond to the intended motor gesture. While theoretical accounts of speech monitoring posit multiple functional components required for the detection of errors in speech planning (e.g., Levelt, 1983), neuroimaging studies generally implicate either single brain regions sensitive to speech production errors or small, discrete networks. Here we demonstrate that auditory feedback control of speech is supported by a distributed neural network involved in linguistic, motoric and sensory processing. With the aid of novel real-time acoustic analyses and representational similarity analyses of fMRI signals, our data reveal functionally differentiated networks underlying auditory feedback control of speech.
|Number of pages||3|
|Publication status||Published - 2012|
|Event||18th Annual Meeting of the Organization for Human Brain Mapping - China National Convention Center, Beijing, China|
Duration: 10 Jun 2012 → 14 Jun 2012