The cerebral haemodynamics of music perception. A transcranial Doppler sonography study.

The perception of music has been investigated by several neurophysiological and neuroimaging methods. Results from these studies suggest a right hemisphere dominance for non-musicians and a possible left hemisphere dominance for musicians. However, inconsistent results have been obtained, and not all variables have been controlled by the different methods. We performed a study with functional transcranial Doppler sonography (fTCD) of the middle cerebral artery to evaluate changes in cerebral blood flow velocity (CBFV) during different periods of music perception. Twenty-four healthy right-handed subjects were enrolled and examined during rest and during listening to periods of music with predominant language, rhythm and harmony content. The gender, musical experience and mode of listening of the subjects were chosen as independent factors; the type of music was included as the variable in repeated measurements. We observed a significant increase of CBFV in the right hemisphere in non-musicians during harmony perception but not during rhythm perception; this effect was more pronounced in females. Language perception was lateralized to the left hemisphere in all subject groups. Musicians showed increased CBFV values in the left hemisphere which were independent of the type of stimulus, and background listeners showed increased CBFV values during harmony perception in the right hemisphere which were independent of their musical experience. The time taken to reach the peak of CBFV was significantly longer in non-musicians when compared with musicians during rhythm and harmony perception. Pulse rates were significantly decreased in non-musicians during harmony perception, probably due to a specific relaxation effect in this subgroup. The resistance index did not show any significant differences, suggesting only regional changes of small resistance vessels but not of large arteries. 
Our fTCD study confirms previous findings of right hemisphere lateralization for harmony perception in non-musicians. In addition, we showed that this effect is more pronounced in female subjects and in background listeners and that the lateralization is delayed in non-musicians compared with musicians for the perception of rhythm and harmony stimuli. Our data suggest that musicians and non-musicians have different strategies to lateralize musical stimuli, with a delayed but marked right hemisphere lateralization during harmony perception in non-musicians and an attentive mode of listening contributing to a left hemisphere lateralization in musicians.
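The hemispheric effects described above are typically summarized in fTCD work by a laterality index computed from left- and right-side CBFV changes. The abstract does not give the authors' exact measure, so the sketch below uses the common (L - R)/(L + R) convention as an assumption, with hypothetical velocity-change values for illustration.

```python
# Hedged illustration: a conventional fTCD laterality index over
# relative CBFV increases. The (L - R) / (L + R) formula and the
# example values are assumptions, not taken from the study.

def laterality_index(dv_left, dv_right):
    """Positive values indicate left-hemisphere dominance, negative
    values right-hemisphere dominance, for CBFV increases dv_*."""
    return (dv_left - dv_right) / (dv_left + dv_right)

# Hypothetical percentage CBFV increases per hemisphere.
li_harmony_nonmusician = laterality_index(2.0, 6.0)  # negative: right-lateralized
li_language = laterality_index(6.0, 2.0)             # positive: left-lateralized
```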

Is integer arithmetic fundamental to mental processing? The mind's secret arithmetic.

Unlike the ability to acquire our native language, we struggle to learn multiplication and division. It may then come as a surprise that the mental machinery for performing lightning-fast integer arithmetic calculations could be within us all even though it cannot be readily accessed, nor do we have any idea of its primary function. We are led to this provocative hypothesis by analysing the extraordinary skills of autistic savants. In our view such individuals have privileged access to lower levels of information not normally available through introspection.

When that tune runs through your head: a PET investigation of auditory imagery for familiar melodies.

The present study used positron emission tomography (PET) to examine the cerebral activity pattern associated with auditory imagery for familiar tunes. Subjects either imagined the continuation of nonverbal tunes cued by their first few notes, listened to a short sequence of notes as a control task, or listened and then reimagined that short sequence. Subtraction of the activation in the control task from that in the real-tune imagery task revealed primarily right-sided activation in frontal and superior temporal regions, plus supplementary motor area (SMA). Isolating retrieval of the real tunes by subtracting activation in the reimagine task from that in the real-tune imagery task revealed activation primarily in right frontal areas and right superior temporal gyrus. Subtraction of activation in the control condition from that in the reimagine condition, intended to capture imagery of unfamiliar sequences, revealed activation in SMA, plus some left frontal regions. We conclude that areas of right auditory association cortex, together with right and left frontal cortices, are implicated in imagery for familiar tunes, in accord with previous behavioral, lesion and PET data. Retrieval from musical semantic memory is mediated by structures in the right frontal lobe, in contrast to results from previous studies implicating left frontal areas for all semantic retrieval. The SMA seems to be involved specifically in image generation, implicating a motor code in this process.

Musical rhythms in heart period dynamics: a cross-cultural and interdisciplinary approach to cardiac rhythms.

The purpose of this study was to expand classic heart period analysis methods by techniques from ethnomusicology that explicitly take complex musical rhythm principles into consideration. The methods used are based on the theory of African music, the theory of symbolic dynamics, and combinatorial theory. Heart period tachograms from 192 24-h electrocardiograms of 96 healthy subjects were transformed into binary symbol sequences that were interpretable as elementary rhythmic (percussive) patterns, the time lines in African music. Using a hierarchical rhythm pattern scheme closely related to the Derler Rhythm Classification (from jazz theory), we calculated the predominance and stability of pattern classes. The results show that during sleep certain classes, specific to individuals, occurred in a cyclically recurrent manner and many times more often than expected. Simultaneously, other classes disappeared more or less completely. Moreover, the most frequent classes obviously originate from phase-locking processes in autonomic regulation (e.g., between respiratory and cardiac cycles). In conclusion, the new interdisciplinary method presented here demonstrates that heart period patterns, in particular those occurring during night sleep, can be interpreted as musical rhythms. This method may be of great potential use in music therapy research.

The perception of visual images encoded in musical form: a study in cross-modality information transfer.

This study demonstrates the ability of blind (previously sighted) and blindfolded (sighted) subjects to reconstruct and identify a number of visual targets transformed into equivalent musical representations. Visual images are deconstructed through a process which selectively segregates different features of the image into separate packages. These are then encoded in sound and presented as a polyphonic musical melody resembling a Baroque fugue with many voices. Subjects can analyse the component voices selectively in combination, or separately in sequence, and thereby patch together and bind the different features of the object into a mental percept of a single recognizable entity. The visual targets used in this study included a variety of geometrical figures, simple high-contrast line drawings of man-made objects, natural and urban scenes, etc., translated into sound and presented to the subject in polyphonic musical form.
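One simple image-to-sound mapping of the kind described above treats image columns as time steps and rows as pitches, so that simultaneous pixels become a chord and each row behaves like a voice. The sketch below is an illustration only; the specific mapping and MIDI pitch values are assumptions, not the study's encoding.

```python
# Illustrative sketch: encode a binary image as a polyphonic note
# sequence (columns -> time steps, rows -> pitches). The bottom-row
# base pitch of 60 (middle C) is an assumed convention.

def image_to_voices(image):
    """For each time step (column), return the sorted MIDI pitches of
    the rows whose pixel is on; lower rows map to lower pitches."""
    n_rows = len(image)
    n_cols = len(image[0])
    base_pitch = 60
    score = []
    for col in range(n_cols):
        chord = [base_pitch + (n_rows - 1 - row)
                 for row in range(n_rows) if image[row][col]]
        score.append(sorted(chord))
    return score

# A 3x4 binary image of a rising diagonal line.
img = [
    [0, 0, 0, 1],
    [0, 1, 1, 0],
    [1, 0, 0, 0],
]
score = image_to_voices(img)  # one ascending pitch per time step
```

A rising line thus becomes an ascending melody, which is the kind of sound-to-shape correspondence a listener can learn to invert.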

The effects of skill on the eye-hand span during musical sight-reading.

The eye-hand span (EHS) is the separation between eye position and hand position when sight-reading music. It can be measured in two ways: in notes (the number of notes between hand and eye; the 'note index'), or in time (the length of time between fixation and performance; the 'time index'). The EHSs of amateur and professional pianists were compared while they sight-read music. The professionals showed significantly larger note indexes than the amateurs (approximately four notes, compared to two notes), and all subjects showed similar variability in the note index. Surprisingly, the different groups of pianists showed almost identical mean time indexes (ca. 1 s), with no significant differences between any of the skill levels. However, professionals did show significantly less variation than the amateurs. The time index was significantly affected by the performance tempo: when fast tempos were imposed on performance, all subjects showed a reduction in the time index (to ca. 0.7 s), and slow tempos increased the time index (to ca. 1.3 s). This means that the length of time that information is stored in the buffer is related to performance tempo rather than ability, but that professionals can fit more information into their buffers.
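The two EHS measures defined above can be computed directly from paired per-note timestamps. The event format below (one fixation time and one keypress time per note, in seconds) is an assumption for illustration; in practice these come from synchronized eye-tracking and MIDI performance data.

```python
# Hedged sketch of the two eye-hand span measures: the 'time index'
# (fixation-to-performance delay) and the 'note index' (notes fixated
# but not yet played). Timestamps are hypothetical.

def time_index(fixation_s, keypress_s):
    """Mean delay (s) between fixating each note and performing it."""
    spans = [k - f for f, k in zip(fixation_s, keypress_s)]
    return sum(spans) / len(spans)

def note_index(fixation_s, keypress_s, t):
    """Number of notes fixated but not yet played at time t."""
    fixated = sum(1 for f in fixation_s if f <= t)
    played = sum(1 for k in keypress_s if k <= t)
    return fixated - played

fix = [0.0, 0.5, 1.0, 1.5]
key = [1.0, 1.5, 2.0, 2.5]      # each note played 1 s after fixation
ti = time_index(fix, key)        # 1.0 s, matching the ~1 s reported above
ni = note_index(fix, key, 1.2)   # eye has reached 3 notes, hand has played 1
```

The finding that the time index tracks tempo while the note index tracks skill falls out of these definitions: at a faster tempo the same ~1 s buffer simply holds more notes.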

Receptive amusia: evidence for cross-hemispheric neural networks underlying music processing strategies.

Perceptual musical functions were investigated in patients suffering from unilateral cerebrovascular cortical lesions. Using the MIDI (Musical Instrument Digital Interface) technique, a standardized short test battery was established that covers local (analytical) as well as global perceptual mechanisms. These represent the principal cognitive strategies in melodic and temporal musical information processing (local: interval and rhythm; global: contour and metre). Of the participating brain-damaged patients, a total of 69% presented with post-lesional impairments in music perception. Left-hemisphere-damaged patients showed significant deficits in the discrimination of local as well as global structures in both melodic and temporal information processing. Right-hemisphere-damaged patients also revealed an overall impairment of music perception, reaching significance in the temporal conditions. Detailed analysis outlined a hierarchical organization, with an initial right-hemisphere recognition of contour and metre followed by identification of interval and rhythm via left-hemisphere subsystems. Patterns of dissociated and associated melodic and temporal deficits indicate autonomous, yet partially integrated neural subsystems underlying the processing of melodic and temporal stimuli. In conclusion, these data contradict a strong hemispheric specificity for music perception, and instead indicate cross-hemisphere, fragmented neural substrates underlying local and global musical information processing in the melodic and temporal dimensions. Given the diverse profiles of neuropsychological deficits revealed in earlier investigations as well as in this study, individual aspects of musicality and musical behaviour very likely contribute to the final formation of these widely distributed neural networks.

Intersensory redundancy guides attentional selectivity and perceptual learning in infancy.

This study assessed an intersensory redundancy hypothesis, which holds that in early infancy information presented redundantly and in temporal synchrony across two sense modalities selectively recruits attention and facilitates perceptual differentiation more effectively than does the same information presented unimodally. Five-month-old infants' sensitivity to the amodal property of rhythm was examined in 3 experiments. Results revealed that habituation to a bimodal (auditory and visual) rhythm resulted in discrimination of a novel rhythm, whereas habituation to the same rhythm presented unimodally (auditory or visual) resulted in no evidence of discrimination. Also, temporal synchrony between the bimodal auditory and visual information was necessary for rhythm discrimination. These findings support an intersensory redundancy hypothesis and provide further evidence for the importance of redundancy for guiding and constraining early perceptual learning.