Increasing confidence in vergence as a cue to distance.
Multiple cues contribute to the visual perception of an object's distance from the observer. The manner in which the nervous system combines these various cues is of considerable interest. Although it is accepted that image cues play a significant role in distance perception, controversy exists regarding the use of kinaesthetic information about the eyes' state of convergence. We used a perturbation technique to explore the contribution of vergence to visually based distance estimates as a function of both fixation distance and the availability of retinal information. Our results show that the nervous system increases the weighting given to vergence as (i) fixation distance becomes closer; and (ii) the available retinal image cues decrease. We also identified the presence of a strong contraction bias when distance cues were studied in isolation, but we argue that such biases do not suggest that vergence provides an ineffectual signal for near-space perception.
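The cue-weighting result in this abstract fits the standard reliability-weighted (maximum-likelihood) model of cue combination, in which each cue is weighted in inverse proportion to its variance. A minimal sketch with made-up illustrative numbers, not data from the study:

```python
# Reliability-weighted (maximum-likelihood) cue combination: each cue
# is weighted in inverse proportion to its variance. The distances and
# variances below are illustrative values only, not data from the study.

def combine_cues(estimates, variances):
    """Return the variance-weighted mean of independent cue estimates
    together with the normalized weights."""
    inv = [1.0 / v for v in variances]
    total = sum(inv)
    weights = [w / total for w in inv]
    combined = sum(w * e for w, e in zip(weights, estimates))
    return combined, weights

# Vergence is precise in near space (small variance) and noisy far
# away, so its weight falls as fixation distance grows.
near_est, near_w = combine_cues([0.30, 0.35], [0.01, 0.04])  # vergence, retinal (m)
far_est, far_w = combine_cues([2.0, 2.4], [1.0, 0.25])
print(near_w[0])  # vergence weight at near fixation (approx. 0.8)
print(far_w[0])   # vergence weight at far fixation (approx. 0.2)
```

Under this model, a perturbation of the vergence signal shifts the combined estimate in proportion to the vergence weight, which is how perturbation techniques read the weighting out behaviourally.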
Effects of viewing distance on the responses of horizontal canal-related secondary vestibular neurons during angular head rotation.
The eye movements generated by the horizontal canal-related angular vestibuloocular reflex (AVOR) depend on the distance of the image from the head and the axis of head rotation. The effects of viewing distance on the responses of 105 horizontal canal-related central vestibular neurons were examined in two squirrel monkeys that were trained to fixate small, earth-stationary targets at different distances (10 and 150 cm) from their eyes. The majority of these cells (77/105) were identified as secondary vestibular neurons by synaptic activation following electrical stimulation of the vestibular nerve. All of the viewing distance-sensitive units were also sensitive to eye movements in the absence of head movements. Some classes of eye movement-related vestibular units were more sensitive to viewing distance than others. For example, the average increase in rotational gain (discharge rate/head velocity) of position-vestibular-pause units was 20%, whereas the gain increase of eye-head-velocity units was 44%. The concomitant change in gain of the AVOR was 11%. The responses of units during near viewing lagged those generated during far-target viewing in phase by 6-25 degrees. A similar phase lag was not observed in either the near AVOR eye movements or the firing behavior of burst-position units in the vestibular nuclei, whose firing behavior was related only to eye movements. The viewing distance-related increase in the evoked eye movements and in the rotational gain of all unit classes declined progressively as stimulus frequency increased from 0.7 to 4.0 Hz. When the monkeys canceled their VOR by fixating head-stationary targets, the responses recorded during near and far target viewing were comparable.
However, the viewing distance-related response changes exhibited by central units were not directly attributable to the eye movement signals they generated. Subtraction of static eye position signals reduced, but did not abolish, viewing distance gain changes in most units. Smooth pursuit eye velocity sensitivity and viewing distance sensitivity were not well correlated. We conclude that the central premotor pathways that mediate the AVOR also mediate viewing distance-related changes in the reflex. Because irregular vestibular nerve afferents are necessary for viewing distance-related gain changes in the AVOR, we suggest that a central estimate of viewing distance is used to parametrically modify vestibular afferent inputs to secondary vestibuloocular reflex pathways.
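Rotational gain as defined in these abstracts is simply response modulation divided by head velocity, so the reported percentage changes follow from the ratio of near- to far-viewing gains. A small sketch with values chosen to mirror the ~20% figure for position-vestibular-pause units; these are illustrative numbers, not recorded data:

```python
# Rotational gain as defined in these abstracts: peak response
# modulation (spikes/s) divided by head velocity (deg/s). The numbers
# are illustrative, chosen to mirror the reported ~20% near-viewing
# gain increase of position-vestibular-pause units; they are not
# recorded data.

def rotational_gain(modulation_sps, head_velocity_dps):
    return modulation_sps / head_velocity_dps

far_gain = rotational_gain(30.0, 40.0)   # far-target viewing
near_gain = rotational_gain(36.0, 40.0)  # near-target viewing
percent_increase = 100.0 * (near_gain - far_gain) / far_gain
print(percent_increase)  # approx. 20 percent
```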
Effects of viewing distance on the responses of vestibular neurons to combined angular and linear vestibular stimulation.
The firing behavior of 59 horizontal canal-related secondary vestibular neurons was studied in alert squirrel monkeys during the combined angular and linear vestibuloocular reflex (CVOR). The CVOR was evoked by positioning the animal's head 20 cm in front of, or behind, the axis of rotation during whole-body rotation (0.7, 1.9, and 4.0 Hz). The effect of viewing distance was studied by having the monkeys fixate small targets that were either near (10 cm) or far (1.3-1.7 m) from the eyes. Most units (50/59) were sensitive to eye movements and were monosynaptically activated after electrical stimulation of the vestibular nerve (51/56 tested). The responses of eye movement-related units were significantly affected by viewing distance. The viewing distance-related change in response gain of many eye-head-velocity and burst-position units was comparable with the change in eye movement gain. On the other hand, position-vestibular-pause units were approximately half as sensitive to changes in viewing distance as were eye movements. The sensitivity of units to the linear vestibuloocular reflex (LVOR) was estimated by subtracting the angular vestibuloocular reflex (AVOR)-related responses recorded with the head centered on the axis of rotation from the CVOR responses. During far-target viewing, unit sensitivity to linear translation was small, but during near-target viewing the firing rate of many units was strongly modulated. The LVOR responses and viewing distance-related LVOR responses of most units were nearly in phase with linear head velocity. The signals generated by secondary vestibular units during voluntary cancellation of the AVOR and CVOR were comparable. However, unit sensitivities to linear translation and angular rotation were not well correlated during either far- or near-target viewing.
Unit LVOR responses were also not well correlated with their sensitivity to smooth pursuit eye movements or with their sensitivity to viewing distance during the AVOR. On the other hand, there was a significant correlation between static eye position sensitivity and sensitivity to viewing distance. We conclude that secondary horizontal canal-related vestibuloocular pathways are an important part of the premotor neural substrate that produces the LVOR. The otolith sensory signals that appear on these pathways have been spatially and temporally transformed to match the angular eye movement commands required to stabilize images at different distances. We suggest that this transformation may be performed by the circuits related to temporal integration of the LVOR.
Long range interactions between oriented texture elements.
Long range interactions between texture elements (short, oriented line segments) were examined. Specifically, we studied the influence of a background array of texture elements on the detectability of a target element (separated from the background by an intermediate textured region), using textures like those of Caputo (Vis. Res. 1996, 36, 2815-2826). We found that, in general, when the background elements were oriented orthogonally to the target element, detection of the target element was better than when the background elements had the same orientation as the target element. We discuss these interactions in terms of inhibitory and excitatory connections between orientation- and spatial-frequency-selective linear filters (e.g., filters that mimic V1 simple cells) that would respond to the individual texture elements.
Perceived distance, shape and size.
If distance, shape and size are judged independently from the retinal and extra-retinal information at hand, different kinds of information can be expected to dominate each judgement, so that errors in one judgement need not be consistent with errors in the others. In order to evaluate how independent these three judgements are, we examined how adding information that improves one judgement influences the others. Subjects adjusted the size and the global shape of a computer-simulated ellipsoid to match a tennis ball. They then indicated manually where they judged the simulated ball to be. Adding information about distance improved all three judgements in a consistent manner, demonstrating that a considerable part of the errors in all three judgements was due to misestimating the distance. Adding information about shape that is independent of distance improved subjects' judgements of shape, but did not influence the set size or the manually indicated distance. Thus, subjects ignored conflicts between the cues when judging the shape, rather than using the conflicts to improve their estimate of the ellipsoid's distance. We conclude that the judgements are quite independent, in the sense that no attempt is made to attain consistency, but that they do rely on some common measures, such as that of distance.
Driver distance from the steering wheel: perception and objective measurement.
OBJECTIVES: This study assessed the accuracy of drivers' perceptions of the distance between the driver's nose and the steering wheel, a factor relevant to drivers' decisions to disconnect a steering-wheel airbag intended to prevent injury in a crash. METHODS: A cross-sectional survey of 1000 drivers was done to obtain perceived and objective measurements of the distance between the driver's nose and the steering wheel of the vehicle. RESULTS: Of 234 drivers who believed that they sat within 12 inches of the steering wheel, only 8 (3%) actually did so, whereas of 658 drivers who did not believe that they sat within 12 inches of the wheel, 14 (2%) did so. Shorter drivers were more likely than taller ones both to underestimate and to overestimate their seating distance. CONCLUSIONS: Considerable misperception of drivers' distance from the wheel indicates that drivers should objectively measure this distance.
Explaining the moon illusion.
An old explanation of the moon illusion holds that various cues place the horizon moon at an effectively greater distance than the elevated moon. Although both moons have the same angular size, the horizon moon must therefore be perceived as larger. More recent explanations hold that differences in accommodation or other factors cause the elevated moon to appear smaller; as a result of this illusory difference in size, the elevated moon appears to be more distant than the horizon moon. These two explanations, both based on the geometry of stereopsis, lead to two diametrically opposed hypotheses: a depth interval at a long distance is associated with a smaller binocular disparity, whereas an equal depth interval at a smaller distance is associated with a larger disparity. We conducted experiments involving artificial moons and confirmed the hypothesis that the horizon moon is at a greater perceptual distance. Moreover, when a moon of constant angular size was moved closer, it was perceived as growing smaller, which is consistent with the older explanation. Although Emmert's law does not predict the size-distance relationship over long distances, we conclude that the horizon moon is perceived as larger because the perceptual system treats it as though it is much farther away. Finally, we observe that recent explanations substitute perceived size for angular size as a cue to distance; thus, they imply that perceptions cause perceptions.
Honeybee navigation: nature and calibration of the "odometer".
There are two theories about how honeybees estimate the distance to food sources. One proposes that distance flown is estimated in terms of energy consumption. The other suggests that the cue is visual and is derived from the extent to which the image of the world has moved on the eye during the trip. Here the two theories are tested by observing the dances of bees that have flown through a short, narrow tunnel to collect a food reward. The results show that the honeybee's "odometer" is visually driven. They also provide a calibration of the dance and the odometer in visual terms.
Distance perception refers to the ability to accurately judge the distance or depth of an object in relation to oneself or to other objects. It is a complex process that involves both visual and non-visual cues, such as perspective, size, texture, motion parallax, binocular disparity, and familiarity with the object or scene.
In the visual system, distance perception is supported in part by the convergence of the two eyes on an object, which provides information about its depth and location in three-dimensional space. The brain integrates this information with other sensory inputs and prior knowledge to create a coherent perception of the environment.
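For a symmetrically fixated target, the vergence angle and fixation distance are linked by simple triangulation: distance = IPD / (2 · tan(vergence/2)), where IPD is the interpupillary distance. A brief sketch, assuming a typical adult IPD of 6.3 cm (an illustrative value, not one taken from the studies above):

```python
import math

# Triangulation linking the convergence (vergence) angle to fixation
# distance for a symmetrically fixated target:
#   distance = IPD / (2 * tan(vergence / 2))
# The 6.3 cm interpupillary distance is a typical adult value, assumed
# here for illustration.

def distance_from_vergence(vergence_deg, ipd_m=0.063):
    half_angle = math.radians(vergence_deg) / 2.0
    return ipd_m / (2.0 * math.tan(half_angle))

# Vergence changes steeply in near space and flattens with distance,
# which is one reason it is most informative for near targets.
for angle in (12.0, 6.0, 1.0):
    print(f"{angle:5.1f} deg -> {distance_from_vergence(angle):6.2f} m")
# approx. 0.30 m, 0.60 m, and 3.61 m respectively
```

The rapidly shrinking angle beyond a few metres means small measurement noise in vergence translates into large distance errors far away, consistent with the weighting results described in the first abstract.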
Disorders of distance perception can result from various conditions that affect the visual system, such as amblyopia, strabismus, or traumatic brain injury. These disorders can cause difficulties in tasks that require accurate depth perception, such as driving, sports, or manual work.