Facial expressions, their communicatory functions and neuro-cognitive substrates.

Human emotional expressions serve a crucial communicatory role, allowing the rapid transmission of valence information from one individual to another. This paper reviews the literature on the neural mechanisms necessary for this communication: both the mechanisms involved in the production of emotional expressions and those involved in the interpretation of the emotional expressions of others. Finally, reference is made to the neuropsychiatric disorders of autism, psychopathy and acquired sociopathy, conditions in which the appropriate processing of emotional expressions is impaired. In autism, it is argued that the basic response to emotional expressions remains intact but that the ability to represent the referent of the individual displaying the emotion is impaired. In psychopathy, the response to fearful and sad expressions is attenuated, and this interferes with socialization, resulting in an individual who fails to learn to avoid actions that harm others. In acquired sociopathy, the response to angry expressions in particular is attenuated, resulting in reduced regulation of social behaviour.

Complex movements evoked by microstimulation of the ventral intraparietal area.

Most neurons in the ventral intraparietal area (VIP) of the macaque brain respond to both visual and tactile stimuli. The tactile receptive field is usually on the face, and the visual receptive field usually corresponds spatially to the tactile receptive field. In this study, electrical microstimulation of VIP, but not of surrounding tissue, caused a constellation of movements including eye closure, facial grimacing, head withdrawal, elevation of the shoulder, and movements of the hand to the space beside the head or shoulder. A similar set of movements was evoked by an air puff to the monkey's cheek. One interpretation is that VIP contributes to defensive movements triggered by stimuli on or near the head.

Specific brain processing of facial expressions in people with alexithymia: an H2(15)O-PET study.

Alexithymia is a personality trait characterized by a reduced ability to identify and describe one's own feelings, and it is known to contribute to a variety of physical and behavioural disorders. To elucidate the pathogenesis of stress-related disorders and the normal functions of emotion, it is important to investigate the neurobiology of alexithymia. Although several neurological models of alexithymia have been proposed, there is very little direct evidence for the neural correlates of alexithymia. Using PET, we studied brain activity in subjects with alexithymia while they viewed a range of emotional facial expressions. Twelve alexithymic and 12 non-alexithymic volunteers (all right-handed males) were selected from 247 applicants on the basis of the 20-item Toronto Alexithymia Scale (TAS-20). Regional cerebral blood flow (rCBF) was measured with H2(15)O-PET while the subjects looked at angry, sad and happy faces of varying emotional intensity, as well as neutral faces. Brain responses in the subjects with alexithymia differed significantly from those in the subjects without alexithymia. The alexithymics exhibited lower rCBF than the non-alexithymics in the inferior and middle frontal cortex, orbitofrontal cortex, inferior parietal cortex and occipital cortex of the right hemisphere. Additionally, the alexithymics showed higher rCBF in the superior frontal cortex, inferior parietal cortex and cerebellum of the left hemisphere compared with the non-alexithymics. A covariance analysis revealed that, when subjects viewed angry and sad facial expressions, rCBF in the inferior and superior frontal cortex, orbitofrontal cortex and parietal cortex of the right hemisphere correlated negatively with individual TAS-20 scores, and that no region showed rCBF that correlated positively with TAS-20 scores. Moreover, the anterior cingulate cortex and insula were less activated in the alexithymics' response to angry faces than in their response to neutral faces. These results suggest that people with alexithymia process facial expressions differently from people without alexithymia, and that this difference may account for the disorder of affect regulation and consequent peculiar behaviour in people with alexithymia.
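
The covariance analysis described above is, in essence, a correlation between regional blood flow and questionnaire scores across subjects. For readers unfamiliar with this type of analysis, a minimal region-of-interest sketch in Python is given below. It is not the authors' pipeline: it assumes per-subject mean rCBF values have already been extracted from one right-hemisphere region for the angry-face condition, and all variable names and numbers are hypothetical.

    # Hypothetical ROI-level analogue of the covariance analysis: correlate
    # per-subject rCBF estimates with individual TAS-20 scores.
    import numpy as np
    from scipy import stats

    # Per-subject mean rCBF in one right-hemisphere ROI (illustrative values)
    rcbf_right_roi = np.array([52.1, 49.8, 47.3, 50.2, 45.9, 48.6,
                               44.7, 46.5, 43.9, 45.1, 42.8, 44.0])
    # Corresponding TAS-20 total scores for the same 12 subjects (illustrative)
    tas20_scores = np.array([38, 41, 55, 40, 61, 48, 63, 57, 66, 59, 70, 64])

    r, p = stats.pearsonr(tas20_scores, rcbf_right_roi)
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")
    # A negative r would mirror the reported negative rCBF-TAS-20 relationship.
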

Habituation of rostral anterior cingulate cortex to repeated emotionally salient pictures.

Habituation of the neural response to repeated stimuli has been well demonstrated for subcortical limbic regions responding to emotionally salient stimuli. Although the rostral or affective division of the anterior cingulate cortex (rACC) is also engaged during emotional processing, little is known about the temporal dynamics of this region in the sustained evaluation of emotional salience. Using a test/retest design, the present study assessed habituation in the human brain with functional magnetic resonance imaging. Eight healthy subjects were exposed to two repeated runs of aversive, neutral, and blank images. Activation of the rACC to negatively valenced pictures occurred only in the first session, and this activation was significantly greater in the first session than in the second. Additionally, medial prefrontal cortex, hippocampal, and amygdalar activations were noted during the first, but not the second, presentation of aversive pictures. These findings highlight the phasic activity of the rACC in emotional processing, consistent with habituation.
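
A test/retest habituation effect of this kind reduces, at the region-of-interest level, to a paired comparison of response estimates between the first and second presentations. The sketch below, in Python, is illustrative only and is not the study's analysis pipeline; the per-subject rACC response estimates for the eight subjects are assumed values.

    # Illustrative paired comparison of rACC response estimates between the
    # first and second presentation of aversive pictures (hypothetical betas).
    import numpy as np
    from scipy import stats

    betas_session1 = np.array([0.82, 0.64, 0.91, 0.55, 0.73, 0.60, 0.88, 0.70])
    betas_session2 = np.array([0.31, 0.22, 0.45, 0.18, 0.29, 0.25, 0.40, 0.33])

    t, p = stats.ttest_rel(betas_session1, betas_session2)
    print(f"paired t({len(betas_session1) - 1}) = {t:.2f}, p = {p:.4f}")
    # A significantly larger session-1 response would be consistent with habituation.
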

Changes in emotion after circumscribed surgical lesions of the orbitofrontal and cingulate cortices.

To analyse the functions of different parts of the prefrontal cortex in emotion, patients with different prefrontal surgical excisions were compared on four measures of emotion: voice and face emotional expression identification, social behaviour, and the subjective experience of emotion. Some patients with bilateral lesions of the orbitofrontal cortex (OFC) had deficits in voice and face expression identification, and the group had impairments in social behaviour and significant changes in their subjective emotional state. Some patients with unilateral damage restricted to the OFC also had deficits in voice expression identification, but the group did not have significant changes in social behaviour or in their subjective emotional state. Patients with unilateral lesions of the antero-ventral part of the anterior cingulate cortex (ACC) and/or medial Brodmann area (BA) 9 were, in some cases, impaired on voice and face expression identification, had some change in social behaviour, and had significant changes in their subjective emotional state. Patients with unilateral lesions of both the OFC and the ACC and/or medial BA 9 were, in some cases, impaired on voice and face expression identification, had some changes in social behaviour, and had significant changes in their subjective emotional state. Patients with dorsolateral prefrontal cortex lesions or with medial lesions outside the ACC and medial BA 9 (dorsolateral/other medial group) showed no impairment on any of these measures of emotion. In all cases in which voice expression identification was impaired, there were no deficits in control tests of the discrimination of unfamiliar voices or the recognition of environmental sounds. Thus bilateral or unilateral lesions circumscribed surgically within the OFC can impair emotional voice and/or face expression identification, but significant changes in social behaviour and in subjective emotional state are related to bilateral lesions. Importantly, unilateral lesions of the ACC (including some of medial BA 9) can produce voice and/or face expression identification deficits, and marked changes in subjective emotional state. These findings with surgically circumscribed lesions show that within the prefrontal cortex, both the OFC and the ACC/medial BA 9 region are involved in a number of aspects of emotion in humans, including emotion identification, social behaviour and subjective emotional state, and that the dorsolateral prefrontal areas are not involved in emotion in these ways.

Electrophysiological and haemodynamic correlates of face perception, recognition and priming.

Face perception, recognition and priming were examined with event-related functional magnetic resonance imaging (fMRI) and scalp event-related potentials (ERPs). Face perception was associated with haemodynamic increases in regions including bilateral fusiform and right superior temporal cortices, and with a right posterior negativity (N170), most likely generated in the superior temporal region. Face recognition was associated with haemodynamic increases in fusiform, medial frontal and orbitofrontal cortices, and with a frontocentral positivity from 550 ms poststimulus. Face repetition was associated with a positivity from 400 to 600 ms and with behavioural priming. Repetition of familiar faces was also associated with earlier onset of the ERP familiarity effect, and with haemodynamic decreases in fusiform cortex. These data support a multi-component model of face processing, with priming arising from more than one stage.
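
An ERP repetition effect such as the 400 to 600 ms positivity is typically quantified by averaging epochs for first and repeated presentations and comparing mean amplitude in the window of interest. The Python sketch below illustrates that general procedure only; the epoch arrays, sampling rate and window are assumptions, not the study's data or pipeline.

    # Hypothetical quantification of an ERP repetition effect in a 400-600 ms window.
    import numpy as np

    sfreq = 250                      # sampling rate in Hz (assumed)
    epoch_start = -0.1               # epoch onset relative to stimulus, in seconds
    n_trials, n_samples = 80, 200    # 80 trials, 800 ms epochs at 250 Hz (assumed)

    rng = np.random.default_rng(0)
    first_epochs = rng.normal(0.0, 1.0, (n_trials, n_samples))    # simulated microvolts
    repeat_epochs = rng.normal(0.5, 1.0, (n_trials, n_samples))   # shifted for illustration

    def mean_amplitude(epochs, t_min, t_max):
        # Mean amplitude of the trial-averaged waveform within [t_min, t_max] seconds.
        times = epoch_start + np.arange(n_samples) / sfreq
        window = (times >= t_min) & (times <= t_max)
        return epochs.mean(axis=0)[window].mean()

    effect = mean_amplitude(repeat_epochs, 0.4, 0.6) - mean_amplitude(first_epochs, 0.4, 0.6)
    print(f"repetition effect in 400-600 ms window: {effect:.2f} uV")
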

The use of facial motion and facial form during the processing of identity.

Previous research has shown that facial motion can carry information about age, gender, emotion and, at least to some extent, identity. By combining recent computer animation techniques with psychophysical methods, we show that during the computation of identity the human face recognition system integrates both types of information: individual non-rigid facial motion and individual facial form. This has important implications for cognitive and neural models of face perception, which currently emphasize a separation between the processing of invariant aspects (facial form) and changeable aspects (facial motion) of faces.

Contextual determinants of anger and other negative expressions in young infants.

Two experiments examined how different frustration contexts affect the instrumental and emotional responses of 4- to 5-month-old infants. Three frustrating contexts were investigated: loss of stimulation (extinction), reduction in contingent stimulation (partial reinforcement), and loss of stimulus control (noncontingency). In both experiments, changes in arm activity and facial expressions of anger and sadness coded according to the Maximally Discriminative Facial Movement Coding System (MAX) were the measures of frustration. Both experiments showed that (a) arm responses increased when the contingent stimulus was lost or reduced but decreased when control of the stimulus was lost under noncontingency, (b) MAX-coded anger, but not MAX-coded sadness or blends of anger and sadness, was associated with frustration, and (c) the pattern of anger and arm responses varied with the frustration context. When contingent stimulation was lost or reduced, both anger and arm responses increased, but when expected control was lost under noncontingency, arm responses decreased while anger increased.