Perception
Evidence for an eye-centered spherical representation of the visuomotor map.
During visually guided movement, visual coordinates of target location must be transformed into coordinates appropriate for movement. To investigate the representation of this visuomotor coordinate transformation, we examined changes in pointing behavior induced by a local visuomotor remapping. The visual feedback of finger position was limited to one location within the workspace, at which a discrepancy was introduced between the actual and visually perceived finger position. This remapping induced a change in pointing that extended over the entire workspace and was best captured by a spherical coordinate system centered near the eyes.

Why and how is soft copy reading possible in clinical practice?
The properties of the human visual system (HVS) relevant to the diagnostic process are described after a brief introduction to the general problems and advantages of using soft copy for primary radiology interpretations. The contrast sensitivity at various spatial and temporal frequencies defines the spatial resolution of the eye-brain system and its sensitivity to flicker. Adaptation to the displayed radiological scene and the ambient illumination determines the dynamic range over which the HVS operates. Although image display devices are shaped mainly by state-of-the-art technology, analysis of the HVS may suggest technical characteristics for electronic displays that help optimize the display to the operation of the HVS. These include display size, spatial resolution, contrast resolution, luminance range, and noise, from which further consequences for the technical components of a monitor follow. It is emphasized that routine monitor quality control must be available in clinical practice; these image quality measures must be simple enough to be applied as part of the daily routine, and the test instructions might also serve as elements of technical acceptance and constancy tests.

Differential spatial memory impairment after right temporal lobectomy demonstrated using temporal titration.
In this study, a temporal titration method was employed to explore the extent to which spatial memory is differentially impaired following right temporal lobectomy. The spatial and non-spatial memory of 19 left and 19 right temporal lobectomy (TL) patients was compared with that of 16 normal controls. The subjects studied an array of 16 toy objects and were subsequently tested for object recall, object recognition and memory for the location of the objects. By systematically varying the retention intervals for each group, it was possible to match all three groups on object recall at sub-ceiling levels. When memory for the position of the objects was assessed at equivalent delays, the right TL group showed disrupted spatial memory compared with both the left TL and control groups (P < 0.05). MRI was used to quantify the extent of temporal lobe resection in the two patient groups, and a significant correlation between hippocampal removal and both recall of spatial location and object name recall was shown in the right TL group only. These data support the notion of a selective (but not exclusive) spatial memory impairment associated with right temporal lobe damage that is related to the integrity of hippocampal functioning.

Unilateral neglect and disambiguation of the Necker cube.
Three groups of patients (right brain-damaged patients with or without left neglect, and left brain-damaged patients) and a group of healthy subjects, matched for age and educational level to the three patient groups, were asked to report which of the two frontal surfaces of Necker cubes oriented in four different ways looked, at first sight, nearer to the viewer. The extent to which, and the way in which, disambiguation of the apparent perspective of Necker cubes occurred was found to vary across the four orientations and to be different in left-neglect patients compared with subjects of the other three groups. With normal subjects, the disambiguating factor is suggested to be a disposition to perceive the upper surface, which is nearly orthogonal to the frontal plane, as external to the cube. This would result from a navigation of the observer's spatial attention towards its target along a particular path that is altered in patients suffering from left neglect. It is suggested that comparison of the paths followed by the attentional vectors of normal subjects and left-neglect patients is potentially fruitful for a better understanding of the brain's normal mechanisms of spatial attention and of unresolved issues concerning perception of the Necker cube.

Spatial- and task-dependent neuronal responses during real and virtual translocation in the monkey hippocampal formation.
Neuropsychological data in humans demonstrate a pivotal role of the medial temporal lobe, including the hippocampal formation (HF) and the parahippocampal gyrus (PH), in allocentric (environment-centered) spatial learning and memory. In the present study, the functional significance of monkey HF and PH neurons in allocentric spatial processing was analyzed during performance of spatial tasks. In the tasks, the monkey either moved freely to one of four reward areas in the experimental field by driving a cab it rode in (real translocation task) or moved a pointer freely to one of four reward areas on a monitor (virtual translocation task) by manipulating a joystick. Of 389 neurons recorded from the monkey HF and PH, 166 had place fields that displayed increased activity in a specific area of the experimental field and/or on the monitor (location-differential neurons). More HF and PH neurons responded in the real translocation task. These neurons had low mean spontaneous firing rates (0.96 spikes/sec), similar to those of rodent HF place cells. The remaining nonresponsive neurons had significantly higher mean firing rates (8.39 spikes/sec), similar to interneurons or theta cells in the rodent HF. Furthermore, most location-differential neurons showed different responses in different tasks. These results suggest that the HF and PH are crucial in allocentric information processing and, moreover, that the HF can encode different reference frames that are context- or task-dependent. This may be the neural basis of episodic memory.

Crossmodal associative memory representations in rodent orbitofrontal cortex.
Firing patterns of neurons in the orbitofrontal cortex (OF) were analyzed in rats trained to perform a task that encouraged incidental associations between distinct odors and the places where their occurrence was detected. Many of the neurons fired differentially when the animals were at a particular location or sampled particular odors. Furthermore, a substantial fraction of the cells exhibited odor-specific firing patterns prior to odor presentation, when the animal arrived at a location associated with that odor. These findings suggest that neurons in the OF encode cross-modal associations between odors and locations within long-term memory.

Functionally independent components of the late positive event-related potential during visual spatial attention.
Human event-related potentials (ERPs) were recorded from 10 subjects presented with visual target and nontarget stimuli at five screen locations and responding to targets presented at one of the locations. The late positive response complexes of 25-75 ERP average waveforms from the two task conditions were simultaneously analyzed with Independent Component Analysis, a new computational method for blindly separating linearly mixed signals. Three spatially fixed, temporally independent, behaviorally relevant, and physiologically plausible components were identified without reference to peaks in single-channel waveforms. A novel frontoparietal component (P3f) began at approximately 140 msec and peaked, in faster responders, at the onset of the motor command. The scalp distribution of P3f appeared consistent with brain regions activated during spatial orienting in functional imaging experiments. A longer-latency large component (P3b), positive over parietal cortex, was followed by a postmotor potential (Pmp) component that peaked 200 msec after the button press and reversed polarity near the central sulcus. A fourth component associated with a left frontocentral nontarget positivity (Pnt) was evoked primarily by target-like distractors presented in the attended location. When no distractors were presented, responses of five faster-responding subjects contained the largest P3f and smallest Pmp components; when distractors were included, a Pmp component appeared only in responses of the five slower-responding subjects. Direct relationships between component amplitudes, latencies, and behavioral responses, plus similarities between component scalp distributions and regional activations reported in functional brain imaging experiments, suggest that P3f, Pmp, and Pnt measure the time course and strength of functionally distinct brain processes.

Space representation in unilateral spatial neglect.
Patients with unilateral brain lesions were given a task requiring exploration of space with the hand in order to assess the visual dependency of unilateral spatial neglect. The task was carried out both without visual control and under visual control, and performances were compared with those of normal subjects. The results were: (1) patients with right brain damage and no visual field defect demonstrated left-sided neglect only when the exploration was not controlled visually; (2) patients with left or right brain damage and a visual field defect demonstrated contralateral neglect only when the exploration was under visual guidance. The performance of the patients with right brain damage without a visual field defect is not clearly understood. The other results suggest that inner spatial representation remains intact in most cases of spatial neglect. The role of parietal lobe damage in the development of this visually induced phenomenon is hypothesised, and the dominant position of vision among the senses is indicated.

Space perception, in the context of neuroscience and psychology, refers to the ability to perceive and understand the spatial arrangement of objects and their relationship to oneself. It involves integrating various sensory inputs such as visual, auditory, tactile, and proprioceptive information to create a coherent three-dimensional representation of our environment.
This cognitive process enables us to judge distances, sizes, shapes, and movements of objects around us. It also helps us navigate through space, reach for objects, avoid obstacles, and maintain balance. Disorders in space perception can lead to difficulties in performing everyday activities and may be associated with neurological conditions such as stroke, brain injury, or neurodevelopmental disorders like autism.
In the context of medicine and psychology, perception refers to the neurophysiological processes, cognitive abilities, and psychological experiences that enable an individual to interpret and make sense of sensory information from their environment. It involves the integration of various stimuli such as sight, sound, touch, taste, and smell to form a coherent understanding of one's surroundings, objects, events, or ideas.
Perception is a complex and active process that includes attention, pattern recognition, interpretation, and organization of sensory information. It can be influenced by various factors, including prior experiences, expectations, cultural background, emotional states, and cognitive biases. Alterations in perception may occur due to neurological disorders, psychiatric conditions, sensory deprivation or overload, drugs, or other external factors.
In a clinical setting, healthcare professionals often assess patients' perceptions of their symptoms, illnesses, or treatments to develop individualized care plans and improve communication and adherence to treatment recommendations.
Visual perception refers to the ability to interpret and organize information that comes from our eyes to recognize and understand what we are seeing. It involves several cognitive processes such as pattern recognition, size estimation, movement detection, and depth perception. Visual perception allows us to identify objects, navigate through space, and interact with our environment. Deficits in visual perception can lead to learning difficulties and disabilities.
Motion perception is the ability to interpret and understand the movement of objects in our environment. It is a complex process that involves multiple areas of the brain and the visual system. In medical terms, motion perception refers to the specific function of the visual system to detect and analyze the movement of visual stimuli. This allows us to perceive and respond to moving objects in our environment, which is crucial for activities such as driving, sports, and even maintaining balance. Disorders in motion perception can lead to conditions like motion sickness or difficulty with depth perception.
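As a loose computational analogue of the detection step described above (a deliberate simplification, not a model of the visual system), motion can be flagged wherever luminance changes between successive frames. The function name and threshold below are illustrative choices, not from the source.

```python
import numpy as np

def detect_motion(prev_frame, next_frame, threshold=10):
    """Flag pixels whose luminance changed between two frames.

    Biological motion perception relies on specialised detectors, but simple
    frame differencing captures the core idea: respond to luminance change
    over time.
    """
    diff = np.abs(next_frame.astype(int) - prev_frame.astype(int))
    return diff > threshold

# A bright 2x2 "object" shifts one pixel to the right between frames.
prev_frame = np.zeros((5, 5), dtype=np.uint8)
next_frame = np.zeros((5, 5), dtype=np.uint8)
prev_frame[1:3, 1:3] = 200
next_frame[1:3, 2:4] = 200

changed = detect_motion(prev_frame, next_frame)
print(int(changed.sum()))  # 4: the vacated trailing edge plus the filled leading edge
```

Note that only the edges of the moving object register change; uniform regions produce no signal, which is one reason real motion systems must integrate local measurements.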
Social perception, in the context of psychology and social sciences, refers to the ability to interpret and understand other people's behavior, emotions, and intentions. It is the process by which we make sense of the social world around us, by observing and interpreting cues such as facial expressions, body language, tone of voice, and situational context.
In medical terminology, social perception is not a specific diagnosis or condition, but rather a cognitive skill that can be affected in various mental and neurological disorders, such as autism spectrum disorder, schizophrenia, and dementia. For example, individuals with autism may have difficulty interpreting social cues and understanding other people's emotions and intentions, while those with schizophrenia may have distorted perceptions of social situations and interactions.
Healthcare professionals who work with patients with cognitive or neurological disorders may assess their social perception skills as part of a comprehensive evaluation, in order to develop appropriate interventions and support strategies.
Speech perception is the process by which the brain interprets and understands spoken language. It involves recognizing and discriminating speech sounds (phonemes), organizing them into words, and attaching meaning to those words in order to comprehend spoken language. This process requires the integration of auditory information with prior knowledge and context. Factors such as hearing ability, cognitive function, and language experience can all impact speech perception.
Depth perception is the ability to accurately judge the distance or separation of an object in three-dimensional space. It is a complex visual process that allows us to perceive the world in three dimensions and to understand the spatial relationships between objects.
Depth perception is achieved through a combination of monocular cues, which are visual cues that can be perceived with one eye, and binocular cues, which require input from both eyes. Monocular cues include perspective (the relative size of objects), texture gradients (finer details become smaller as distance increases), and atmospheric perspective (colors become less saturated and lighter in value as distance increases). Binocular cues include convergence (the degree to which the eyes must turn inward to focus on an object) and retinal disparity (the slight difference in the images projected onto the two retinas due to the slightly different positions of the eyes).
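The retinal-disparity cue mentioned above has a simple geometric core that can be sketched in code. This is an idealised pinhole-stereo approximation rather than a model of human vision; the interocular baseline (~6.5 cm) and effective focal length (~17 mm) are assumed, illustrative values.

```python
def depth_from_disparity(disparity_m, baseline_m=0.065, focal_m=0.017):
    """Recover depth (metres) from binocular disparity via stereo triangulation.

    For an object at depth Z, the image disparity is d = focal * baseline / Z,
    so depth is recovered as Z = focal * baseline / d. Baseline and focal
    length here are assumed, human-scale values for illustration.
    """
    if disparity_m <= 0:
        raise ValueError("disparity must be positive")
    return focal_m * baseline_m / disparity_m

# An object 2 m away produces a disparity of f * B / Z between the two images;
# inverting that disparity recovers the 2 m depth.
disparity = 0.017 * 0.065 / 2.0
print(depth_from_disparity(disparity))  # 2.0
```

The inverse relationship explains why binocular depth judgements degrade with distance: beyond a few metres the disparity shrinks toward the limits of retinal resolution, and the monocular cues listed above take over.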
Deficits in depth perception can occur due to a variety of factors, including eye disorders, brain injuries, or developmental delays. These deficits can result in difficulties with tasks such as driving, sports, or navigating complex environments. Treatment for depth perception deficits may include vision therapy, corrective lenses, or surgery.