Evaluating the effects of functional communication training in the presence and absence of establishing operations.
We conducted functional analyses of aberrant behavior with 4 children with developmental disabilities. We then implemented functional communication training (FCT) using different mands across two contexts: one in which the establishing operation (EO) relevant to the function of aberrant behavior was present, and one in which it was absent. The mand used in the EO-present context served the same function as aberrant behavior, and the mand used in the EO-absent context served a different function than the one identified via the functional analysis. In addition, a free-play (control) condition was conducted for all children. Increases in relevant manding were observed in the EO-present context for 3 of the 4 participants. Decreases in aberrant behavior were achieved by the end of the treatment analysis for all 4 participants. Irrelevant mands were rarely observed in the EO-absent context for 3 of the 4 participants. Evaluating the effectiveness of FCT across different contexts allowed a further analysis of manding when the establishing operations were present or absent. The contributions of this study to the understanding of functional equivalence are also discussed.
Using augmentative communication with infants and young children with Down syndrome.
This paper reports the use of two forms of augmentative and alternative communication (AAC) with young children with Down syndrome: a program using signing (Makaton), and the COMPIC system of computerised pictographs. Children with Down syndrome are frequently reported to have difficulties in the area of language and communication, with relative strengths in visual and perceptual areas. This suggests possible benefits from the use of AAC systems to enhance language development. The paper discusses the use of AAC systems to assist young children with Down syndrome, and reports an experimental study of the use of such systems with an object naming task.
Cortical representation of sign language: comparison of deaf signers and hearing non-signers.
Numerous studies have demonstrated activation of the classical left-hemisphere language areas when native signers process sign language. More recently, specific sign language-related processing has been suggested to occur in homologous areas of the right hemisphere as well. We now show that these cortical areas are also activated in hearing non-signers during passive viewing of signs that for them are linguistically meaningless. Neuromagnetic activity was stronger in deaf signers than in hearing non-signers in the region of the right superior temporal sulcus and the left dorsal premotor cortex, probably reflecting familiarity and linguistic meaningfulness of the observed movement sequences. In contrast, the right superior parietal lobule, the mesial parieto-occipital region, and the mesial paracentral lobule were more strongly activated in hearing non-signers, apparently reflecting active visuomotor encoding of complex unfamiliar movement sequences.
The neural organization of discourse: an H2(15)O-PET study of narrative production in English and American Sign Language.
In order to identify brain regions that play an essential role in the production of discourse, H2(15)O-PET scans were acquired during spontaneous generation of autobiographical narratives in English and in American Sign Language in hearing subjects who were native users of both. We compared languages that differ maximally in their mode of expression yet share the same core linguistic properties in order to differentiate the stages of discourse production: differences between the languages should reflect later, modality-dependent stages of phonological encoding and articulation; congruencies are more likely to reveal the anatomy of earlier modality-independent stages of conceptualization and lexical access. Common activations were detected in a widespread array of regions; left hemisphere language areas classically related to speech were also robustly activated during sign production, but the common neural architecture extended beyond the classical language areas and included extrasylvian regions in both right and left hemispheres. Furthermore, posterior perisylvian and basal temporal regions appear to play an integral role in spontaneous self-generated formulation and production of language, even in the absence of exteroceptive stimuli. Results additionally indicate that anterior and posterior areas may play distinct roles in early and late stages of language production, and suggest a novel model for lateralization of cerebral activity during the generation of discourse: progression from the early stages of lexical access to later stages of articulatory-motor encoding may constitute a progression from bilateral to left-lateralized activation. This pattern is not predicted by the standard Wernicke-Geschwind model, and may become apparent when language is produced in an ecologically valid context.
Impact of early deafness and early exposure to sign language on the cerebral organization for motion processing.
This functional magnetic resonance imaging study investigated the impact of early auditory deprivation and/or use of a visuospatial language [American sign language (ASL)] on the organization of neural systems important in visual motion processing by comparing hearing controls with deaf and hearing native signers. Participants monitored moving flowfields under different conditions of spatial and featural attention. Recruitment of the motion-selective area MT-MST in hearing controls was observed to be greater when attention was directed centrally and when the task was to detect motion features, confirming previous reports that the motion network is selectively modulated by different aspects of attention. More importantly, we observed marked differences in the recruitment of motion-related areas as a function of early experience. First, the lateralization of MT-MST was found to shift toward the left hemisphere in early signers, suggesting that early exposure to ASL leads to a greater reliance on the left MT-MST. Second, whereas the two hearing populations displayed more MT-MST activation under central than peripheral attention, the opposite pattern was observed in deaf signers, indicating enhanced recruitment of MT-MST during peripheral attention after early deafness. Third, deaf signers, but neither of the hearing populations, displayed increased activation of the posterior parietal cortex, supporting the view that parietal functions are modified after early auditory deprivation. Finally, only in deaf signers did attention to motion result in enhanced recruitment of the posterior superior temporal sulcus, establishing for the first time in humans that this polymodal area is modified after early sensory deprivation. Together these results highlight the functional and regional specificity of neuroplasticity in humans.
Signing and lexical development in children with Down syndrome.
Language development in children with Down syndrome is delayed, on average, relative to general cognitive, motor and social development, and there is also evidence for specific delays in morphology and syntax, with many adults showing persistent problems in these areas. It appears that the combined use of signed and spoken input can significantly boost early language development; this evidence comes initially from single case studies, and more recently from larger scale controlled studies. Research with typically developing hearing and deaf children, as well as children with Down syndrome, has demonstrated the importance of establishing joint attention for vocabulary development. Furthermore, studies carried out with children with Down syndrome indicate that reducing attentional demands may be especially important in scaffolding language development in this group. We discuss the use, when signing to children with Down syndrome, of signing strategies that have been found to facilitate language development in deaf children, as well as the need for further research on this topic and on the importance of joint attention for the use of other augmentative and alternative communication systems, such as graphic symbol and picture systems.
Neural systems underlying British Sign Language and audio-visual English processing in native users.
In order to understand the evolution of human language, it is necessary to explore the neural systems that support language processing in its many forms. In particular, it is informative to separate those mechanisms that may have evolved for sensory processing (hearing) from those that have evolved to represent events and actions symbolically (language). To what extent are the brain systems that support language processing shaped by auditory experience and to what extent by exposure to language, which may not necessarily be acoustically structured? In this first neuroimaging study of the perception of British Sign Language (BSL), we explored these questions by measuring brain activation using functional MRI in nine hearing and nine congenitally deaf native users of BSL while they performed a BSL sentence-acceptability task. Eight hearing, non-signing subjects performed an analogous task that involved audio-visual English sentences. The data support the argument that there are both modality-independent and modality-dependent language localization patterns in native users. In relation to modality-independent patterns, regions activated by both BSL in deaf signers and by spoken English in hearing non-signers included inferior prefrontal regions bilaterally (including Broca's area) and superior temporal regions bilaterally (including Wernicke's area). Lateralization patterns were similar for the two languages. There was no evidence of enhanced right-hemisphere recruitment for BSL processing in comparison with audio-visual English. In relation to modality-specific patterns, audio-visual speech in hearing subjects generated greater activation in the primary and secondary auditory cortices than BSL in deaf signers, whereas BSL generated enhanced activation in the posterior occipito-temporal regions (V5), reflecting the greater movement component of BSL. 
The influence of hearing status on the recruitment of sign language processing systems was explored by comparing deaf and hearing adults who had BSL as their first language (native signers). Deaf native signers demonstrated greater activation in the left superior temporal gyrus in response to BSL than hearing native signers. This important finding suggests that left temporal auditory regions may be privileged for processing heard speech even in hearing native signers. However, in the absence of auditory input this region can be recruited for visual processing.
Coarticulation in fluent fingerspelling.
In speech, the phenomenon of coarticulation (differentiation of phoneme production depending on the preceding or following phonemes) suggests an organization of movement sequences that is not strictly serial. In the skeletal motor system, however, evidence for comparable fluency has been lacking. Thus the present study was designed to quantify coarticulation in the hand movement sequences of sign language interpreters engaged in fingerspelling. Records of 17 measured joint angles were subjected to discriminant and correlation analyses to determine to what extent and in what manner the hand shape for a particular letter was influenced by the hand shapes for the preceding or the following letters. Substantial evidence of coarticulation was found, revealing both forward and reverse influences across letters. These influences could be further categorized as assimilation (tending to reduce the differences between sequential hand shapes) or dissimilation (tending to emphasize the differences between sequential hand shapes). The proximal interphalangeal (PIP) joints of the index and middle fingers tended to show dissimilation, whereas at the same time (i.e., during the spelling of the same letters) the joints of the wrist and thumb tended to show assimilation. The index and middle finger PIP joints have been shown previously to be among the most important joints for computer recognition of the 26 letter shapes, and therefore the dissimilation may have served to enhance visual discrimination. The simultaneous occurrence of dissimilation in some joints and assimilation in others demonstrates an unprecedented level of parallel control of individual joint rotations in an essentially serial task.
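The correlational logic of the coarticulation analysis above can be illustrated with a minimal sketch. This is not the study's data or method: the letter set, target angles, and the anticipatory-blending model are all hypothetical, and only one joint is simulated. The sketch shows how a correlation between the realized angle for a fixed current letter and the target angle of the following letter serves as evidence of forward (anticipatory) influence.

```python
# Hypothetical sketch of detecting anticipatory coarticulation at one joint.
# Letters, target angles, and the blending model below are illustrative
# assumptions, not data or parameters from the study.
import random

random.seed(0)

# Idealized target angles (degrees) for a few letters at a single joint.
TARGETS = {"a": 10.0, "b": 60.0, "c": 35.0, "d": 80.0}

def realized_angle(letter, next_letter, carryover=0.3):
    """Blend the current letter's target with the upcoming letter's target
    (anticipatory coarticulation), plus small motor noise."""
    blend = (1 - carryover) * TARGETS[letter] + carryover * TARGETS[next_letter]
    return blend + random.gauss(0, 1.0)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hold the current letter fixed ("a") and vary the following letter; if the
# realized angle tracks the upcoming letter's target, that is forward
# (anticipatory) influence in this toy model.
letters = list(TARGETS)
next_letters = [random.choice(letters) for _ in range(500)]
angles = [realized_angle("a", nl) for nl in next_letters]
next_targets = [TARGETS[nl] for nl in next_letters]

r = pearson(angles, next_targets)
print(f"correlation with upcoming letter's target angle: r = {r:.2f}")
```

In this toy model a strongly positive r indicates assimilation toward the upcoming hand shape; a negative correlation under an analogous setup would correspond to dissimilation. Reverse (carryover) influence could be probed symmetrically by conditioning on the preceding letter.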