
Lateralized automatic auditory processing of phonetic versus musical information: a PET study.

Human Brain Mapping | Jun 4, 2000

Previous positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) studies show that during attentive listening, processing of phonetic information is associated with higher activity in the left auditory cortex than in the right, while the opposite is true for musical information. The present PET study determined whether automatically activated neural mechanisms for phonetic and musical information are lateralized. To this end, subjects engaged in a visual word classification task were presented with phonetic sound sequences consisting of frequent (P = 0.8) and infrequent (P = 0.2) phonemes and with musical sound sequences consisting of frequent (P = 0.8) and infrequent (P = 0.2) chords. The phonemes and chords were matched in spectral complexity as well as in the magnitude of frequency difference between the frequent and infrequent sounds (/e/ vs. /o/; A major vs. A minor). In addition, control sequences, consisting of either frequent sounds (/e/; A major) or infrequent sounds (/o/; A minor), were employed in separate blocks. When sound sequences consisted of intermixed frequent and infrequent sounds, automatic phonetic processing was lateralized to the left hemisphere and musical processing to the right hemisphere. This lateralization, however, did not occur in control blocks with only one type of sound (frequent or infrequent). The data thus indicate that automatic activation of lateralized neuronal circuits requires sound comparison based on short-term sound representations.

PubMed ID: 10864231

MeSH terms: Adult | Auditory Perception | Brain | Dominance, Cerebral | Humans | Male | Music | Phonetics | Tomography, Emission-Computed

Research resources used in this publication: none found.

Research tools detected in this publication: none found.

Data used in this publication: none found.

Associated grants: none listed.
Publication data is provided by the National Library of Medicine® and PubMed®. Data is retrieved from PubMed® on a weekly schedule. For terms and conditions, see the National Library of Medicine Terms and Conditions.
