Speech Perception: The process whereby an utterance is decoded into a representation in terms of linguistic units (sequences of phonetic segments which combine to form lexical and grammatical morphemes).
Speech: Communication through a system of conventional vocal symbols.
Speech Intelligibility: Ability to make speech sounds that are recognizable.
Speech Acoustics: The acoustic aspects of speech in terms of frequency, intensity, and time.
Cochlear Implants: Electronic hearing devices typically used for patients with normal outer and middle ear function, but defective inner ear function. In the COCHLEA, the hair cells (HAIR CELLS, VESTIBULAR) may be absent or damaged but there are residual nerve fibers. The device electrically stimulates the COCHLEAR NERVE to create sound sensation.
Phonetics: The science or study of speech sounds and their production, transmission, and reception, and their analysis, classification, and transcription. (Random House Unabridged Dictionary, 2d ed)
Speech Discrimination Tests: Tests of the ability to hear and understand speech as determined by scoring the number of words in a word list repeated correctly.
Cochlear Implantation: Surgical insertion of an electronic hearing device (COCHLEAR IMPLANTS) with electrodes to the COCHLEAR NERVE in the inner ear to create sound sensation in patients with residual nerve fibers.
Speech Production Measurement: Measurement of parameters of the speech product such as vocal tone, loudness, pitch, voice quality, articulation, resonance, phonation, phonetic structure and prosody.
Audiometry, Speech: Measurement of the ability to hear speech under various conditions of intensity and noise interference using sound-field as well as earphones and bone oscillators.
Speech Disorders: Acquired or developmental conditions marked by an impaired ability to comprehend or generate spoken forms of language.
Noise: Any sound which is unwanted or interferes with HEARING other sounds.
Speech Reception Threshold Test: A test to determine the lowest sound intensity level at which fifty percent or more of the spondaic test words (words of two syllables having equal stress) are repeated correctly.
Acoustic Stimulation: Use of sound to elicit a response in the nervous system.
Perception: The process by which the nature and meaning of sensory stimuli are recognized and interpreted.
Deafness: A general term for the complete loss of the ability to hear from both ears.
Auditory Perception: The process whereby auditory stimuli are selected, organized, and interpreted by the organism.
Speech Therapy: Treatment for individuals with speech defects and disorders that involves counseling and use of various exercises and aids to help the development of new speech habits.
Sound Spectrography: The graphic registration of the frequency and intensity of sounds, such as speech, infant crying, and animal vocalizations.
Lipreading: The process by which an observer comprehends speech by watching the movements of the speaker's lips without hearing the speaker's voice.
Hearing Aids: Wearable sound-amplifying devices that are intended to compensate for impaired hearing. These generic devices include air-conduction hearing aids and bone-conduction hearing aids. (UMDNS, 1999)
Auditory Threshold: The audibility limit of discriminating sound intensity and pitch.
Language Development: The gradual expansion in complexity and meaning of symbols and sounds as perceived and interpreted by the individual through a maturational and learning process. Stages in development include babbling, cooing, word imitation with cognition, and use of short sentences.
Persons With Hearing Impairments: Persons with any degree of loss of hearing that has an impact on their activities of daily living or that requires special assistance or intervention.
Psychoacoustics: The science pertaining to the interrelationship of psychologic phenomena and the individual's response to the physical properties of sound.
Hearing: The ability or act of sensing and transducing ACOUSTIC STIMULATION to the CENTRAL NERVOUS SYSTEM. It is also called audition.
Correction of Hearing Impairment: Procedures for correcting HEARING DISORDERS.
Pitch Perception: A dimension of auditory sensation varying with cycles per second of the sound stimulus.
Perceptual Masking: The interference of one perceptual stimulus with another causing a decrease or lessening in perceptual effectiveness.
Visual Perception: The selecting and organizing of visual stimuli based on the individual's past experience.
Voice: The sounds produced by humans by the passage of air through the LARYNX and over the VOCAL CORDS, and then modified by the resonance organs, the NASOPHARYNX, and the MOUTH.
Auditory Cortex: The region of the cerebral cortex that receives the auditory radiation from the MEDIAL GENICULATE BODY.
Linguistics: The science of language, including phonetics, phonology, morphology, syntax, semantics, pragmatics, and historical linguistics. (Random House Unabridged Dictionary, 2d ed)
Speech Articulation Tests: Tests of accuracy in pronouncing speech sounds, e.g., Iowa Pressure Articulation Test, Deep Test of Articulation, Templin-Darley Tests of Articulation, Goldman-Fristoe Test of Articulation, Screening Speech Articulation Test, Arizona Articulation Proficiency Scale.
Hearing Loss, Sensorineural: Hearing loss resulting from damage to the COCHLEA and the sensorineural elements which lie internally beyond the oval and round windows. These elements include the AUDITORY NERVE and its connections in the BRAINSTEM.
Language: A verbal or nonverbal means of communicating ideas or feelings.
Vocabulary: The sum or the stock of words used by a language, a group, or an individual. (From Webster, 3d ed)
Psycholinguistics: A discipline concerned with relations between messages and the characteristics of individuals who select and interpret them; it deals directly with the processes of encoding (phonetics) and decoding (psychoacoustics) as they relate states of messages to states of communicators.
Hearing Loss: A general term for the complete or partial loss of the ability to hear from one or both ears.
Evoked Potentials, Auditory: The electric response evoked in the CEREBRAL CORTEX by ACOUSTIC STIMULATION or stimulation of the AUDITORY PATHWAYS.
Audiometry, Pure-Tone: Measurement of hearing based on the use of pure tones of various frequencies and intensities as auditory stimuli.
Acoustics: The branch of physics that deals with sound and sound waves. In medicine it is often applied in procedures in speech and hearing studies. With regard to the environment, it refers to the characteristics of a room, auditorium, theatre, building, etc. that determine the audibility or fidelity of sounds in it. (From Random House Unabridged Dictionary, 2d ed)
Hearing Loss, Central: Hearing loss due to disease of the AUDITORY PATHWAYS (in the CENTRAL NERVOUS SYSTEM) which originate in the COCHLEAR NUCLEI of the PONS and then ascend bilaterally to the MIDBRAIN, the THALAMUS, and then the AUDITORY CORTEX in the TEMPORAL LOBE. Bilateral lesions of the auditory pathways are usually required to cause central hearing loss. Cortical deafness refers to loss of hearing due to bilateral auditory cortex lesions. Unilateral BRAIN STEM lesions involving the cochlear nuclei may result in unilateral hearing loss.
Hearing Loss, Bilateral: Partial hearing loss in both ears.
Hearing Tests: Part of an ear examination that measures the ability of sound to reach the brain.
Gestures: Movement of a part of the body for the purpose of communication.
Audiometry: The testing of the acuity of the sense of hearing to determine the thresholds of the lowest intensity levels at which an individual can hear a set of tones. The frequencies between 125 and 8000 Hz are used to test air conduction thresholds and the frequencies between 250 and 4000 Hz are used to test bone conduction thresholds.
Child Language: The language and sounds expressed by a child at a particular maturational stage in development.
Voice Quality: That component of SPEECH which gives the primary distinction to a given speaker's VOICE when pitch and loudness are excluded. It involves both phonatory and resonatory characteristics. Some of the descriptions of voice quality are harshness, breathiness and nasality.
Comprehension: The act or fact of grasping the meaning, nature, or importance of; understanding. (American Heritage Dictionary, 4th ed) Includes understanding by a patient or research subject of information disclosed orally or in writing.
Language Tests: Tests designed to assess language behavior and abilities. They include tests of vocabulary, comprehension, grammar and functional use of language, e.g., Development Sentence Scoring, Receptive-Expressive Emergent Language Scale, Parsons Language Sample, Utah Test of Language Development, Michigan Language Inventory and Verbal Language Development Scale, Illinois Test of Psycholinguistic Abilities, Northwestern Syntax Screening Test, Peabody Picture Vocabulary Test, Ammons Full-Range Picture Vocabulary Test, and Assessment of Children's Language Comprehension.
Music: Sound that expresses emotion through rhythm, melody, and harmony.
Speech Recognition Software: Software capable of recognizing dictation and transcribing the spoken words into written text.
Cues: Signals for an action; that specific portion of a perceptual field or pattern of stimuli to which a subject has learned to respond.
Auditory Pathways: NEURAL PATHWAYS and connections within the CENTRAL NERVOUS SYSTEM, beginning at the hair cells of the ORGAN OF CORTI, continuing along the eighth cranial nerve, and terminating at the AUDITORY CORTEX.
Diagnostic Techniques, Otological: Methods and procedures for the diagnosis of diseases of the ear or of hearing disorders or demonstration of hearing acuity or loss.
Multilingualism: The ability to speak, read, or write several languages or many languages with some facility. Bilingualism is the most common form. (From Random House Unabridged Dictionary, 2d ed)
Auditory Perceptual Disorders: Acquired or developmental cognitive disorders of AUDITORY PERCEPTION characterized by a reduced ability to perceive information contained in auditory stimuli despite intact auditory pathways. Affected individuals have difficulty with speech perception, sound localization, and comprehending the meaning of inflections of speech.
Reading
Hearing Disorders: Conditions that impair the transmission of auditory impulses and information from the level of the ear to the temporal cortices, including the sensorineural pathways.
Lip: Either of the two fleshy, full-blooded margins of the mouth.
Semantics: The relationships between symbols and their meanings.
Aphasia, Broca: An aphasia characterized by impairment of expressive LANGUAGE (speech, writing, signs) and relative preservation of receptive language abilities (i.e., comprehension). This condition is caused by lesions of the motor association cortex in the FRONTAL LOBE (BROCA AREA and adjacent cortical and white matter regions).
Pattern Recognition, Physiological: The analysis of a critical number of sensory stimuli or facts (the pattern) by physiological processes such as vision (PATTERN RECOGNITION, VISUAL), touch, or hearing.
Time Perception: The ability to estimate periods of time lapsed or duration of time.
Language Development Disorders: Conditions characterized by language abilities (comprehension and expression of speech and writing) that are below the expected level for a given age, generally in the absence of an intellectual impairment. These conditions may be associated with DEAFNESS; BRAIN DISEASES; MENTAL DISORDERS; or environmental factors.
Functional Laterality: Behavioral manifestations of cerebral dominance in which there is preferential use and superior functioning of either the left or the right side, as in the preferred use of the right hand or right foot.
Phonation: The process of producing vocal sounds by means of VOCAL CORDS vibrating in an expiratory blast of air.
Dyslexia: A cognitive disorder characterized by an impaired ability to comprehend written and printed words or phrases despite intact vision. This condition may be developmental or acquired. Developmental dyslexia is marked by reading achievement that falls substantially below that expected given the individual's chronological age, measured intelligence, and age-appropriate education. The disturbance in reading significantly interferes with academic achievement or with activities of daily living that require reading skills. (From DSM-IV)
Recognition (Psychology): The knowledge or perception that someone or something present has been previously encountered.
Brain Mapping: Imaging techniques used to colocalize sites of brain functions or physiological activity with brain structures.
Pitch Discrimination: The ability to differentiate tones.
Motion Perception: The real or apparent movement of objects through the visual field.
Social Perception: The perceiving of attributes, characteristics, and behaviors of one's associates or social groups.
Dysarthria: Disorders of speech articulation caused by imperfect coordination of pharynx, larynx, tongue, or face muscles. This may result from CRANIAL NERVE DISEASES; NEUROMUSCULAR DISEASES; CEREBELLAR DISEASES; BASAL GANGLIA DISEASES; BRAIN STEM diseases; or diseases of the corticobulbar tracts (see PYRAMIDAL TRACTS). The cortical language centers are intact in this condition. (From Adams et al., Principles of Neurology, 6th ed, p489)
Bionics: The study of systems, particularly electronic systems, which function after the manner of, in a manner characteristic of, or resembling living systems. Also, the science of applying biological techniques and principles to the design of electronic systems.
Speech, Esophageal: A method of speech used after laryngectomy, with sound produced by vibration of the column of air in the esophagus against the contracting cricopharyngeal sphincter. (Dorland, 27th ed)
Temporal Lobe: Lower lateral part of the cerebral hemisphere responsible for auditory, olfactory, and semantic processing. It is located inferior to the lateral fissure and anterior to the OCCIPITAL LOBE.
Loudness Perception: The perceived attribute of a sound which corresponds to the physical attribute of intensity.
Cochlear Nerve: The cochlear part of the 8th cranial nerve (VESTIBULOCOCHLEAR NERVE). The cochlear nerve fibers originate from neurons of the SPIRAL GANGLION and project peripherally to cochlear hair cells and centrally to the cochlear nuclei (COCHLEAR NUCLEUS) of the BRAIN STEM. They mediate the sense of hearing.
Discrimination (Psychology): Differential response to different stimuli.
Magnetic Resonance Imaging: Non-invasive method of demonstrating internal anatomy based on the principle that atomic nuclei in a strong magnetic field absorb pulses of radiofrequency energy and emit them as radiowaves which can be reconstructed into computerized images. The concept includes proton spin tomographic techniques.
Critical Period (Psychology): A specific stage in animal and human development during which certain types of behavior normally are shaped and molded for life.
Photic Stimulation: Investigative technique commonly used during ELECTROENCEPHALOGRAPHY in which a series of bright light flashes or visual patterns are used to elicit brain activity.
Signal Processing, Computer-Assisted: Computer-assisted processing of electric, ultrasonic, or electronic signals to interpret function and activity.
Magnetoencephalography: The measurement of magnetic fields over the head generated by electric currents in the brain. As in any electrical conductor, electric fields in the brain are accompanied by orthogonal magnetic fields. The measurement of these fields provides information about the localization of brain activity which is complementary to that provided by ELECTROENCEPHALOGRAPHY. Magnetoencephalography may be used alone or together with electroencephalography, for measurement of spontaneous or evoked activity, and for research or clinical purposes.
Aphasia: A cognitive disorder marked by an impaired ability to comprehend or express language in its written or spoken form. This condition is caused by diseases which affect the language areas of the dominant hemisphere. Clinical features are used to classify the various subtypes of this condition. General categories include receptive, expressive, and mixed forms of aphasia.
Stuttering: A disturbance in the normal fluency and time patterning of speech that is inappropriate for the individual's age. This disturbance is characterized by frequent repetitions or prolongations of sounds or syllables. Various other types of speech dysfluencies may also be involved including interjections, broken words, audible or silent blocking, circumlocutions, words produced with an excess of physical tension, and monosyllabic whole word repetitions. Stuttering may occur as a developmental condition in childhood or as an acquired disorder which may be associated with BRAIN INFARCTIONS and other BRAIN DISEASES. (From DSM-IV, 1994)
Speech, Alaryngeal: Methods of enabling a patient without a larynx or with a non-functional larynx to produce voice or speech. The methods may be pneumatic or electronic.
Sound Localization: Ability to determine the specific location of a sound source.
Articulation Disorders: Disorders of the quality of speech characterized by the substitution, omission, distortion, and addition of phonemes.
Analysis of Variance: A statistical technique that isolates and assesses the contributions of categorical independent variables to variation in the mean of a continuous dependent variable.
Reaction Time: The time from the onset of a stimulus until a response is observed.
Depth Perception: Perception of three-dimensionality.
Signal Detection, Psychological: Psychophysical technique that permits the estimation of the bias of the observer as well as detectability of the signal (i.e., stimulus) in any sensory modality. (From APA, Thesaurus of Psychological Index Terms, 8th ed.)
Auditory Brain Stem Implants: Multi-channel hearing devices typically used for patients who have tumors on the COCHLEAR NERVE and are unable to benefit from COCHLEAR IMPLANTS after tumor surgery that severs the cochlear nerve. The device electrically stimulates the COCHLEAR NUCLEI in the BRAIN STEM rather than the inner ear as in cochlear implants.
Apraxias: A group of cognitive disorders characterized by the inability to perform previously learned skills that cannot be attributed to deficits of motor or sensory function. The two major subtypes of this condition are ideomotor (see APRAXIA, IDEOMOTOR) and ideational apraxia, which refers to loss of the ability to mentally formulate the processes involved with performing an action. For example, dressing apraxia may result from an inability to mentally formulate the act of placing clothes on the body. Apraxias are generally associated with lesions of the dominant PARIETAL LOBE and supramarginal gyrus. (From Adams et al., Principles of Neurology, 6th ed, pp56-7)
Communication Aids for Disabled: Equipment that provides mentally or physically disabled persons with a means of communication. The aids include display boards, typewriters, cathode ray tubes, computers, and speech synthesizers. The output of such aids includes written words, artificial speech, language signs, Morse code, and pictures.
Learning: Relatively permanent change in behavior that is the result of past experience or practice. The concept includes the acquisition of knowledge.
Evoked Potentials, Auditory, Brain Stem: Electrical waves in the CEREBRAL CORTEX generated by BRAIN STEM structures in response to auditory click stimuli. These are found to be abnormal in many patients with CEREBELLOPONTINE ANGLE lesions, MULTIPLE SCLEROSIS, or other DEMYELINATING DISEASES.
Attention: Focusing on certain aspects of current experience to the exclusion of others. It is the act of heeding or taking notice or concentrating.
Time Factors: Elements of limited time intervals, contributing to particular results or situations.
Electroencephalography: Recording of electric currents developed in the brain by means of electrodes applied to the scalp, to the surface of the brain, or placed within the substance of the brain.
Form Perception: The sensory discrimination of a pattern shape or outline.
Sound: A type of non-ionizing radiation in which energy is transmitted through solid, liquid, or gas as compression waves. Sound (acoustic or sonic) radiation with frequencies above the audible range is classified as ultrasonic; sound radiation below the audible range is classified as infrasonic.
Frontal Lobe: The part of the cerebral hemisphere anterior to the central sulcus, and anterior and superior to the lateral sulcus.
Verbal Behavior: Includes both producing and responding to words, either written or spoken.
Neurobiology: The study of the structure, growth, activities, and functions of NEURONS and the NERVOUS SYSTEM.
Pain Perception: The process by which PAIN is recognized and interpreted by the brain.
Child Development: The continuous sequential physiological and psychological maturing of an individual from birth up to but not including ADOLESCENCE.
Verbal Learning: Learning to respond verbally to a verbal stimulus cue.
Cognition: Intellectual or mental process whereby an organism obtains knowledge.
Brain: The part of the CENTRAL NERVOUS SYSTEM that is contained within the skull (CRANIUM). Arising from the NEURAL TUBE, the embryonic brain is comprised of three major parts: the PROSENCEPHALON (forebrain), the MESENCEPHALON (midbrain), and the RHOMBENCEPHALON (hindbrain). The developed brain consists of the CEREBRUM, the CEREBELLUM, and other structures in the BRAIN STEM.
Psychomotor Performance: The coordination of a sensory or ideational (cognitive) process and a motor activity.
Prosthesis Design: The plan and delineation of prostheses in general or a specific prosthesis.
Touch Perception: The process by which the nature and meaning of tactile stimuli are recognized and interpreted by the brain, such as realizing the characteristics or name of an object being touched.
Space Perception: The awareness of the spatial properties of objects; includes physical space.
Behavior: The observable response of a man or animal to a situation.
Neuropsychological Tests: Tests designed to assess neurological function associated with certain behaviors. They are used in diagnosing brain dysfunction or damage and central nervous system disorders or injury.
Image Processing, Computer-Assisted: A technique of inputting two-dimensional images into a computer and then enhancing or analyzing the imagery into a form that is more useful to the human observer.
Taste Perception: The process by which the nature and meaning of gustatory stimuli are recognized and interpreted by the brain. The four basic classes of taste perception are salty, sweet, bitter, and sour.
Cerebral Cortex: The thin layer of GRAY MATTER on the surface of the CEREBRAL HEMISPHERES that develops from the TELENCEPHALON and folds into gyri and sulci. It reaches its highest development in humans and is responsible for intellectual faculties and higher mental functions.
Speech-Language Pathology: The study of speech or language disorders and their diagnosis and correction.
Language Disorders: Conditions characterized by deficiencies of comprehension or expression of written and spoken forms of language. These include acquired and developmental disorders.
Age Factors: Age as a constituent element or influence contributing to the production of a result. It may be applicable to the cause or the effect of a circumstance. It is used with human or animal concepts but should be differentiated from AGING, a physiological process, and TIME FACTORS, which refers only to the passage of time.
Illusions: The misinterpretation of a real external, sensory experience.
Size Perception: The sensory interpretation of the dimensions of objects.
Questionnaires: Predetermined sets of questions used to collect data - clinical data, social status, occupational group, etc. The term is often applied to a self-completed survey instrument.
Evoked Potentials: Electrical responses recorded from nerve, muscle, SENSORY RECEPTOR, or area of the CENTRAL NERVOUS SYSTEM following stimulation. They range from less than a microvolt to several microvolts. The evoked potential can be auditory (EVOKED POTENTIALS, AUDITORY), somatosensory (EVOKED POTENTIALS, SOMATOSENSORY), visual (EVOKED POTENTIALS, VISUAL), or motor (EVOKED POTENTIALS, MOTOR), or of other modalities that have been reported.
Color Perception: Mental processing of chromatic signals (COLOR VISION) from the eye by the VISUAL CORTEX, where they are converted into symbolic representations. Color perception involves numerous neurons and is influenced not only by the distribution of wavelengths from the viewed object but also by its background color and brightness contrast at its boundary.
Olfactory Perception: The process by which the nature and meaning of olfactory stimuli, such as odors, are recognized and interpreted by the brain.
Parietal Lobe: Upper central part of the cerebral hemisphere. It is located posterior to the central sulcus, anterior to the OCCIPITAL LOBE, and superior to the TEMPORAL LOBES.
Memory, Short-Term: Remembrance of information for a few seconds to hours.
Voice Disorders: Pathological processes that affect voice production, usually involving VOCAL CORDS and the LARYNGEAL MUCOSA. Voice disorders can be caused by organic (anatomical) or functional (emotional or psychological) factors leading to DYSPHONIA; APHONIA; and defects in VOICE QUALITY, loudness, and pitch.
Health Knowledge, Attitudes, Practice: Knowledge, attitudes, and associated behaviors which pertain to health-related topics such as PATHOLOGIC PROCESSES or diseases, their prevention, and treatment. This term refers to non-health workers and health workers (HEALTH PERSONNEL).
Attitude of Health Personnel: Attitudes of personnel toward their patients, other professionals, the medical care system, etc.
Attitude to Health: Public attitudes toward health, disease, and the medical care system.
Velopharyngeal Insufficiency: Failure of the SOFT PALATE to reach the posterior pharyngeal wall to close the opening between the oral and nasal cavities. Incomplete velopharyngeal closure is primarily related to surgeries (ADENOIDECTOMY; CLEFT PALATE) or an incompetent PALATOPHARYNGEAL SPHINCTER. It is characterized by hypernasal speech.
Motor Cortex: Area of the FRONTAL LOBE concerned with primary motor control, located in the dorsal PRECENTRAL GYRUS immediately anterior to the central sulcus. It is comprised of three areas: the primary motor cortex, located on the anterior paracentral lobule on the medial surface of the brain; the premotor cortex, located anterior to the primary motor cortex; and the supplementary motor area, located on the midline surface of the hemisphere anterior to the primary motor cortex.
Weight Perception: Recognition and discrimination of the heaviness of a lifted object.
Jaw: Bony structure of the mouth that holds the teeth. It consists of the MANDIBLE and the MAXILLA.

Language processing is strongly left lateralized in both sexes: evidence from functional MRI.

Functional MRI (fMRI) was used to examine gender effects on brain activation during a language comprehension task. A large number of subjects (50 women and 50 men) were studied to maximize the statistical power to detect subtle differences between the sexes. To estimate the specificity of findings related to sex differences, parallel analyses were performed on two groups of randomly assigned subjects. Men and women showed very similar, strongly left lateralized activation patterns. Voxel-wise tests for group differences in overall activation patterns demonstrated no significant differences between women and men. In further analyses, group differences were examined by region of interest and by hemisphere. No differences were found between the sexes in lateralization of activity in any region of interest or in intrahemispheric cortical activation patterns. These data argue against substantive differences between men and women in the large-scale neural organization of language processes.
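A voxel-wise group comparison of the kind described above can be sketched as a mass-univariate two-sample t-test with a correction for the number of voxels tested. The sketch below is illustrative only: the data are synthetic random values standing in for per-voxel activation estimates, and the Bonferroni correction is one common (conservative) choice, not necessarily the procedure the authors used.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-voxel activation estimates (one contrast value per
# voxel per subject); a real fMRI analysis would use registered 3-D volumes.
n_subjects, n_voxels = 50, 1000
women = rng.normal(0.0, 1.0, size=(n_subjects, n_voxels))
men = rng.normal(0.0, 1.0, size=(n_subjects, n_voxels))

# Voxel-wise two-sample t-test for a sex difference at every voxel.
t, p = stats.ttest_ind(women, men, axis=0)

# Bonferroni correction: a voxel counts as a significant group
# difference only if its p-value survives division of alpha by the
# number of voxels tested.
alpha = 0.05
significant = p < alpha / n_voxels

print(t.shape, int(significant.sum()))
```

With null data drawn from identical distributions, as here, essentially no voxel survives the corrected threshold, which mirrors the paper's null finding for sex differences.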

Effects of talker, rate, and amplitude variation on recognition memory for spoken words.

This study investigated the encoding of the surface form of spoken words using a continuous recognition memory task. The purpose was to compare and contrast three sources of stimulus variability--talker, speaking rate, and overall amplitude--to determine the extent to which each source of variability is retained in episodic memory. In Experiment 1, listeners judged whether each word in a list of spoken words was "old" (had occurred previously in the list) or "new." Listeners were more accurate at recognizing a word as old if it was repeated by the same talker and at the same speaking rate; however, there was no recognition advantage for words repeated at the same overall amplitude. In Experiment 2, listeners were first asked to judge whether each word was old or new, as before, and then they had to explicitly judge whether it was repeated by the same talker, at the same rate, or at the same amplitude. On the first task, listeners again showed an advantage in recognition memory for words repeated by the same talker and at the same speaking rate, but no advantage occurred for the amplitude condition. However, in all three conditions, listeners were able to explicitly detect whether an old word was repeated by the same talker, at the same rate, or at the same amplitude. These data suggest that although information about all three properties of spoken words is encoded and retained in memory, each source of stimulus variation differs in the extent to which it affects episodic memory for spoken words.
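The "recognition advantage" in a continuous recognition task reduces to comparing hit rates for repetitions that preserve a surface property against those that change it, at a comparable false-alarm rate. The toy scoring routine below illustrates the logic for the talker dimension only; the trial records and numbers are invented, not data from the study.

```python
# Hypothetical trial records from a continuous recognition task:
# (was_repetition, same_talker_on_repeat, listener_said_old).
# same_talker_on_repeat is None for genuinely new items.
trials = [
    (True,  True,  True),
    (True,  True,  True),
    (True,  False, True),
    (True,  False, False),
    (False, None,  False),
    (False, None,  True),
]

def hit_rate(trials, same_talker):
    """Proportion of repetitions correctly called 'old', split by
    whether the repetition reused the original talker."""
    reps = [r for r in trials if r[0] and r[1] is same_talker]
    return sum(r[2] for r in reps) / len(reps)

def false_alarm_rate(trials):
    """Proportion of genuinely new items incorrectly called 'old'."""
    new = [r for r in trials if not r[0]]
    return sum(r[2] for r in new) / len(new)

# A same-talker advantage shows up as the first rate exceeding the second.
print(hit_rate(trials, True), hit_rate(trials, False), false_alarm_rate(trials))
```

The same scoring scheme applies unchanged to the rate and amplitude dimensions by swapping in the relevant match flag.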

Infants' learning about words and sounds in relation to objects.

In acquiring language, babies learn not only that people can communicate about objects and events, but also that they typically use a particular kind of act as the communicative signal. The current studies asked whether 1-year-olds' learning of names during joint attention is guided by the expectation that names will be in the form of spoken words. In the first study, 13-month-olds were introduced to either a novel word or a novel sound-producing action (using a small noisemaker). Both the word and the sound were produced by a researcher as she showed the baby a new toy during a joint attention episode. The baby's memory for the link between the word or sound and the object was tested in a multiple choice procedure. Thirteen-month-olds learned both the word-object and sound-object correspondences, as evidenced by their choosing the target reliably in response to hearing the word or sound on test trials, but not on control trials when no word or sound was present. In the second study, 13-month-olds, but not 20-month-olds, learned a new sound-object correspondence. These results indicate that infants initially accept a broad range of signals in communicative contexts and narrow the range with development.

Isolating the contributions of familiarity and source information to item recognition: a time course analysis.

Recognition memory may be mediated by the retrieval of distinct types of information, notably, a general assessment of familiarity and the recovery of specific source information. A response-signal speed-accuracy trade-off variant of an exclusion procedure was used to isolate the retrieval time course for familiarity and source information. In 2 experiments, participants studied spoken and read lists (with various numbers of presentations) and then performed an exclusion task, judging an item as old only if it was in the heard list. Dual-process fits of the time course data indicated that familiarity information typically is retrieved before source information. The implications that these data have for models of recognition, including dual-process and global memory models, are discussed.
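Speed-accuracy trade-off time courses of this kind are commonly summarized by a shifted exponential, d'(t) = lambda * (1 - exp(-beta * (t - delta))) for t > delta and 0 otherwise, where delta is the intercept (when information first becomes available), beta the rate, and lambda the asymptote. The sketch below uses that standard parameterization with invented parameter values to illustrate the paper's qualitative claim that familiarity has an earlier intercept than source information; it is not a fit to the actual data.

```python
import math

def sat_dprime(t, lam, beta, delta):
    """Shifted-exponential speed-accuracy trade-off curve: accuracy (d')
    stays at zero until the intercept delta, then rises at rate beta
    toward the asymptote lam."""
    if t <= delta:
        return 0.0
    return lam * (1.0 - math.exp(-beta * (t - delta)))

# Hypothetical parameters: familiarity (delta = 0.3 s) becomes
# available before source information (delta = 0.5 s).
times = [i / 10 for i in range(21)]   # 0.0 .. 2.0 s after the response signal
familiarity = [sat_dprime(t, 2.0, 4.0, 0.3) for t in times]
source = [sat_dprime(t, 2.5, 4.0, 0.5) for t in times]

# At an early lag (0.4 s), familiarity already supports above-chance
# responding while source accuracy is still at floor.
print(familiarity[4] > 0.0, source[4] == 0.0)
```

In an actual dual-process fit, the two curves would be estimated jointly from exclusion-task accuracy at each response-signal lag, and the intercept difference tested across participants.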

PET imaging of cochlear-implant and normal-hearing subjects listening to speech and nonspeech. (5/2052)

Functional neuroimaging with positron emission tomography (PET) was used to compare the brain activation patterns of normal-hearing (NH) subjects with those of postlingually deaf, cochlear-implant (CI) subjects listening to speech and nonspeech signals. The speech stimuli were derived from test batteries for assessing speech-perception performance of hearing-impaired subjects with different sensory aids. Subjects were scanned while passively listening to monaural (right ear) stimuli in five conditions: Silent Baseline, Word, Sentence, Time-reversed Sentence, and Multitalker Babble. Both groups showed bilateral activation in superior and middle temporal gyri to speech and backward speech. However, group differences were observed in the Sentence condition compared to the Silent Baseline. CI subjects showed more activated foci in right temporal regions, where lateralized mechanisms for prosodic (pitch) processing have been well established; NH subjects showed a focus in the left inferior frontal gyrus (Brodmann's area 47), where semantic processing has been implicated. Multitalker Babble activated auditory temporal regions in the CI group only. Whereas NH listeners probably habituated to this multitalker babble, the CI listeners may be using a perceptual strategy that emphasizes 'coarse' coding to perceive this stimulus globally as speechlike. The group differences provide the first neuroimaging evidence suggesting that postlingually deaf CI and NH subjects may engage differing perceptual processing strategies under certain speech conditions.

Regulation of parkinsonian speech volume: the effect of interlocuter distance. (6/2052)

This study examined the automatic regulation of speech volume over distance in hypophonic patients with Parkinson's disease and age and sex matched controls. There were two speech settings: conversation, and the recitation of sequential material (for example, counting). The perception of interlocuter speech volume by patients with Parkinson's disease and controls over varying distances was also examined, and found to be slightly discrepant. For speech production, it was found that controls significantly increased overall speech volume for conversation relative to that for sequential material. Patients with Parkinson's disease were unable to achieve this overall increase for conversation, and consistently spoke at a softer volume than controls at all distances (intercept reduction). However, patients were still able to increase volume for greater distances in a similar way to controls for conversation and sequential material, thus showing a normal pattern of volume regulation (slope similarity). It is suggested that speech volume regulation is intact in Parkinson's disease, but that its gain is reduced. These findings are reminiscent of skeletal motor control studies in Parkinson's disease, in which the amplitude of movement is diminished but the relation with another factor is preserved (stride length increases as cadence (that is, stepping rate) increases).

Specialization of left auditory cortex for speech perception in man depends on temporal coding. (7/2052)

Speech perception requires cortical mechanisms capable of analysing and encoding successive spectral (frequency) changes in the acoustic signal. To study temporal speech processing in the human auditory cortex, we recorded intracerebral evoked potentials to syllables in right and left human auditory cortices including Heschl's gyrus (HG), planum temporale (PT) and the posterior part of superior temporal gyrus (area 22). Natural voiced (/ba/, /da/, /ga/) and voiceless (/pa/, /ta/, /ka/) syllables, spoken by a native French speaker, were used to study the processing of a specific temporally based acoustico-phonetic feature, the voice onset time (VOT). This acoustic feature is present in nearly all languages, and it is the VOT that provides the basis for the perceptual distinction between voiced and voiceless consonants. The present results show a lateralized processing of acoustic elements of syllables. First, processing of voiced and voiceless syllables is distinct in the left, but not in the right HG and PT. Second, only the evoked potentials in the left HG, and to a lesser extent in PT, reflect a sequential processing of the different components of the syllables. Third, we show that this acoustic temporal processing is not limited to speech sounds but applies also to non-verbal sounds mimicking the temporal structure of the syllable. Fourth, there was no difference between responses to voiced and voiceless syllables in either left or right areas 22. Our data suggest that a single mechanism in the auditory cortex, involved in general (not only speech-specific) temporal processing, may underlie the further processing of verbal (and non-verbal) stimuli. This coding, bilaterally localized in auditory cortex in animals, takes place specifically in the left HG in man. A defect of this mechanism could account for hearing discrimination impairments associated with language disorders.
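VOT itself is a simple interval measure: the time from the stop release burst to the onset of vocal fold vibration. The toy sketch below illustrates the measurement and a categorical decision; the 25 ms boundary is an illustrative English-like value (French voiced stops, as used in this study, are typically prevoiced, i.e. have negative VOT):

```python
def voice_onset_time(burst_onset_s: float, voicing_onset_s: float) -> float:
    """VOT = voicing onset minus release-burst onset, in seconds.
    Negative values (prevoicing) are typical of voiced stops in French."""
    return voicing_onset_s - burst_onset_s

def classify_stop(vot_s: float, boundary_s: float = 0.025) -> str:
    """Toy categorical decision: short-lag or negative VOT -> voiced,
    long-lag VOT -> voiceless. The boundary value is language-dependent."""
    return "voiceless" if vot_s >= boundary_s else "voiced"

print(classify_stop(voice_onset_time(0.100, 0.090)))  # prevoiced stop: voiced
print(classify_stop(voice_onset_time(0.100, 0.160)))  # long-lag stop: voiceless
```

In practice the two onset times would be measured from the waveform (burst transient and first glottal pulse); the point here is only that a single temporal interval carries the voiced/voiceless contrast.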

Cochlear implantations in Northern Ireland: an overview of the first five years. (8/2052)

During the last few years cochlear implantation (CI) has made remarkable progress, developing from a mere research tool into a viable clinical application. The Centre for CI in Northern Ireland was established in 1992 and has since been a provider of this new technology for rehabilitation of profoundly deaf patients in the region. Although individual performance with a cochlear implant cannot be predicted accurately, the overall success of CI can no longer be denied. Seventy-one patients (37 adults and 34 children) have received implants over the first five years of the Northern Ireland cochlear implant programme, which is located at the Belfast City Hospital. The complication rates and the post-implantation outcome of this centre compare favourably with other major centres which undertake the procedure. This paper aims to highlight the patient selection criteria, surgery, post-CI outcome, clinical and research developments within our centre, and future prospects of this recent modality of treatment.

  • Initially, speech perception was assumed to link to speech objects that were both the invariant movements of the speech articulators and the invariant motor commands sent to the muscles to move the vocal tract articulators. This was later revised to refer to the phonetic gestures rather than motor commands, and then to the gestures intended by the speaker at a prevocal, linguistic level, rather than the actual movements. (wikipedia.org)
  • We investigated the consequences of monitoring an asynchronous audiovisual speech stream on the temporal perception of simultaneously presented vowel-consonant-vowel (VCV) audiovisual speech video clips. (ox.ac.uk)
  • Initially, the theory was associationist: infants mimic the speech they hear, and this leads to behavioristic associations between articulation and its sensory consequences. (wikipedia.org)
  • Using a speech synthesizer, speech sounds can be varied in place of articulation along a continuum from /bɑ/ to /dɑ/ to /ɡɑ/, or in voice onset time on a continuum from /dɑ/ to /tɑ/ (for example). (wikipedia.org)
  • To summarize, several cases show significant differences in speech perception between moving maskers and stationary maskers. (rwth-aachen.de)
  • Therefore, speech-in-noise tests measure the hearing impairment in complex scenes and are an integral part of the audiological assessment. (rwth-aachen.de)
  • This result suggests that the consequences of adapting to asynchronous speech extend beyond the case of simple audiovisual stimuli (as has recently been demonstrated by Navarra et al., Cogn Brain Res 25:499-507, 2005) and can even affect the perception of more complex speech stimuli. (ox.ac.uk)
  • This aspect of the theory was dropped, however, with the discovery that prelinguistic infants could already detect most of the phonetic contrasts used to separate different speech sounds. (wikipedia.org)
  • The "speech is special" claim has been dropped, as it was found that speech perception could occur for nonspeech sounds (for example, slamming doors for duplex perception). (wikipedia.org)
  • Though the idea of a module has been qualified in more recent versions of the theory, the idea remains that the role of the speech motor system is not only to produce speech articulations but also to detect them. (wikipedia.org)
  • In the first part of the thesis, the speech perception with a masker moving both away from the target position and toward the target position was analyzed. (rwth-aachen.de)
  • Participants made temporal order judgments (TOJs) regarding whether the speech-sound or the visual-speech gesture occurred first, for video clips presented at various different stimulus onset asynchronies. (ox.ac.uk)
  • At the same time, the benefit of the spatial separation between the speech target and masker(s), known as spatial release from masking (SRM), has been investigated extensively. (rwth-aachen.de)
  • It is a challenging task for researchers to determine how the brain solves multisensory perception, and the neural mechanisms involved remain subject to theoretical conjecture. (diva-portal.org)
  • Univariate and multivariate analyses were performed to isolate the neural correlates of the word- and pitch-based discrimination between song and speech, corrected for rhythmic differences in both. (frontiersin.org)
  • Previous research has demonstrated that in quiet acoustic conditions auditory-visual speech perception occurs faster (decreased latency) and with less neural activity (decreased amplitude) than auditory-only speech perception. (illinois.edu)
  • Sensory neuroscience studies the neural mechanisms underlying perception. (wikipedia.org)
  • Single-trial fMRI blood oxygenation level-dependent (BOLD) responses from perception periods were analyzed using multivariate pattern classification and a searchlight approach to reveal neural activation patterns sensitive to the processing of place of articulation (i.e., bilabial/labiodental vs. alveolar). (eneuro.org)
  • Although much effort has been directed toward understanding the neural basis of speech processing, the neural processes involved in the categorical perception of speech have been relatively less studied, and many questions remain open. (jneurosci.org)
  • We hypothesized that frontal articulation areas are involved in categorical speech perception, but that they may be invisible to subtraction-based fMRI analysis if complex articulatory gestures are represented not by different levels of activity within single voxels, but by differential neural activity patterns within a region of cortex. (jneurosci.org)
  • Researchers of the MRG aim to understand the neural and cognitive basis of human perception and attention processes in multisensory environments. (upf.edu)
  • In general there were no group differences across the training tasks, although the US group showed greater improvement than the BSUT and BS groups on vowel perception. (edu.au)
  • Her research focuses on speech perception and modeling vowel perception. (asha.org)
  • By carefully evaluating speech perception in a variety of test conditions, we can determine what phonemes are not clear and can make some adjusting of technology settings to improve speech perception. (hearinghealthmatters.org)
  • Deactivating cochlear implant electrodes to improve speech perception: A computational approach. (slesystems.com)
  • Using the phenomenon of lexical retuning in speech processing, we ask whether those units are necessarily phonemic. (mpg.de)
  • Here, it is proposed that the earliest developing sensory system (likely somatosensory in the case of speech, including somatosensory feedback from oral-motor movements that are first manifest in the fetus) provides an organization on which auditory speech can build once the peripheral auditory system comes on-line by 22 weeks gestation. (grantome.com)
  • Perception (from the Latin perceptio ) is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information, or the environment. (wikipedia.org)
  • Our findings demonstrate changes in the functional asymmetry of cortical speech processing during adverse acoustic conditions and suggest that "cocktail party" listening skills depend on the quality of speech representations in the left cerebral hemisphere rather than compensatory recruitment of right hemisphere mechanisms. (nih.gov)
  • 2) processing which is connected with a person's concepts and expectations (or knowledge), restorative and selective mechanisms (such as attention ) that influence perception. (wikipedia.org)
  • This functional magnetic resonance imaging study examines shared and distinct cortical areas involved in the auditory perception of song and speech at the level of their underlying constituents: words and pitch patterns. (frontiersin.org)
  • The difference in results between univariate and multivariate pattern-based analyses of the same data suggest that processes in different cortical areas along the dorsal speech perception stream are distributed on different spatial scales. (jneurosci.org)
  • Children affected by dyslexia exhibit a deficit in the categorical perception of speech sounds, characterized by both poorer discrimination of between-category differences and by better discrimination of within-category differences, compared to normal readers. (ed.gov)
  • These categorical perception anomalies might be at the origin of dyslexia, by hampering the set up of grapheme-phoneme correspondences, but they might also be the consequence of poor reading skills, as literacy probably contributes to stabilizing phonological categories. (ed.gov)
  • Speech Perception in Adult Subjects With Familial Dyslexia (https://jslhr.pubs.asha.org/article.aspx?articleid=1779113): Speech perception was investigated in a carefully selected group of adult subjects with familial dyslexia. (asha.org)
  • In the adult, speech perception is richly multimodal. (grantome.com)
  • Importance: Children with a history of amblyopia, even if resolved, exhibit impaired visual-auditory integration and perceive speech differently. (luriechildrens.org)
  • Subtitles in one's native language, the default in some European countries, are harmful to learning to understand foreign speech. (innovations-report.com)
  • Reliable constant relations between a phoneme of a language and its acoustic manifestation in speech are difficult to find. (wikipedia.org)
  • Our research focuses on the study of language learning, its perception, and issues related to language processing in general (with a special emphasis on bilingual populations). (upf.edu)
  • By 6 months of age, infants engage in statistical learning (tracking distributional frequencies) and show a preference for language-specific perception of vowels. (wikiversity.org)
  • If she cannot hear soft speech she will not be able to overhear conversation, which will significantly reduce language exposure. (hearinghealthmatters.org)
  • A large number of research studies focus on how users of a language perceive foreign speech (referred to as cross-language speech perception) or ________ speech (second-language speech perception). (thefullwiki.org)
  • As the Deputy Director of the ASC, Dr Grant has direct supervisory and mission planning responsibilities for the largest Audiology and Speech-Language-Pathology clinic in the DoD. (jhu.edu)
  • Within one's first language, however, adjustments in speech perception appear to have a negligible effect on speech production (Kraljic et al. (exposed-skin-care.net)
  • Evidence has shown that subtle implicit information of a speaker's characteristics or social identity inferred by the listener can influence how language varieties are perceived, and can cause significant effects on the result of speech perception (e.g. (rice.edu)
  • Is Language a Factor in the Perception of Foreign Accent Syndrome? (semanticscholar.org)
  • Motor structures are not necessary for speech perception under normal conditions (Greg has argued this convincingly), but the RT data from this paper suggest to me that they may be of some use sometimes in speech perception, at least under conditions of uncertainty - e.g. in noise (as in this paper) or perhaps during recovery of language function after stroke. (talkingbrains.org)
  • Celia Hooper, ASHA vice president for professional practices in speech-language pathology (2003-2005), and Brian Shulman, ASHA vice president for professional practices in speech-language pathology (2006-2008), served as the monitoring officers. (asha.org)
  • The goal of this technical report on childhood apraxia of speech (CAS) was to assemble information about this challenging disorder that would be useful for caregivers, speech-language pathologists, and a variety of other health care professionals. (asha.org)
  • All children had full insertions of the electrode array without surgical complications and are developing age-appropriate auditory perception and oral language skills. (aappublications.org)
  • Characteristically, pediatric cochlear implant recipients already have significant language and speech delays at the time of implantation, given that, historically, the majority of children received implants at age 2 years or older. (aappublications.org)
  • The results revealed that the visual speech stream had to lead the auditory speech stream by a significantly larger interval in the participants' native language than in the non-native language for simultaneity to be perceived. (ox.ac.uk)
  • Perception of language forms (consonants, vowels, word forms) can be direct if the forms lawfully cause specifying patterning in the energy arrays available to perceivers. (oxfordre.com)
  • Speech is human vocal communication using language . (wikipedia.org)
  • While animals also communicate using vocalizations, and trained apes such as Washoe and Kanzi can use simple sign language, no animal's vocalizations are articulated phonemically and syntactically, so they do not constitute speech. (wikipedia.org)
  • The present combined anatomo-functional case study, for the first time, demonstrated that aSTS/STG in the language dominant hemisphere actively engages in speech perception. (kyoto-u.ac.jp)
  • We are interested in how the motor and auditory cortex interact during speech perception. (ox.ac.uk)
  • Similar findings have been shown for auditory-only speech inputs for signals composed of disjoint and non-overlapping spectral bands where over 90% of the spectral information has been discarded. (jhu.edu)
  • Paper 2 reviews research that has used the sine wave speech paradigm in studies of speech perception. (diva-portal.org)
  • Domain General Change Detection Accounts for 'Dishabituation' Effects in Temporal-Parietal Regions in Functional Magnetic Resonance Imaging Studies of Speech Perception. (mendeley.com)
  • Normal human speech is pulmonic, produced with pressure from the lungs , which creates phonation in the glottis in the larynx , which is then modified by the vocal tract and mouth into different vowels and consonants . (wikipedia.org)
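Several of the notes above describe categorical perception along synthetic continua (for example, a /dɑ/-/tɑ/ continuum varying in VOT). Its hallmark is a steep, sigmoidal identification function: equal physical steps change identification little within a category but sharply at the boundary. A minimal sketch, with an assumed 30 ms boundary and slope (illustrative values, not fitted data):

```python
import numpy as np

def identification_curve(vot_ms, boundary_ms=30.0, slope=0.5):
    """Logistic identification function: probability of a /ta/ ('voiceless')
    response at each step of a synthetic VOT continuum.
    boundary_ms and slope are illustrative values, not measured data."""
    return 1.0 / (1.0 + np.exp(-slope * (np.asarray(vot_ms, float) - boundary_ms)))

continuum = np.linspace(0.0, 60.0, 7)   # seven 10 ms steps from /da/ to /ta/
p_ta = identification_curve(continuum)

# Hallmark of categorical perception: an equal 10 ms step deep inside a
# category barely changes identification, while the step at the boundary
# changes it dramatically.
within_step = p_ta[1] - p_ta[0]    # 0 -> 10 ms, well inside the /da/ category
boundary_step = p_ta[3] - p_ta[2]  # 20 -> 30 ms, at the category boundary
print(boundary_step > 10 * within_step)  # True
```

The dyslexia findings quoted above can be read in these terms: affected children show a shallower boundary step (poorer between-category discrimination) together with larger within-category sensitivity than normal readers.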
Speech perception : Quiz (The Full Wiki) (quiz.thefullwiki.org)
Cortical Activity Predicts Which Older Adults Recognize Speech in Noise and When | Journal of Neuroscience (jneurosci.org)
Foreign Subtitles Help but Native-Language Subtitles Harm Foreign Speech Perception (journals.plos.org)
Bilinguals and Accents | Psychology Today (psychologytoday.com)
Uofmemphisvideos - YouTube (youtube.com)
Frontiers | Intracochlear Recordings of Acoustically and Electrically Evoked Potentials in Nucleus Hybrid L24 Cochlear Implant... (frontiersin.org)
Jeremy I Skipper - Mendeley (mendeley.com)
Lecturer in Speech Perception/Production - University College London - jobs.ac.uk (jobs.ac.uk)
Modelling of dynamic perception and action | Max Planck Institute for Human Cognitive and Brain Sciences (cbs.mpg.de)
September 17, 2005 | Science News (sciencenews.org)
Categorical Speech Processing in Broca's Area: An fMRI Study Using Multivariate Pattern-Based Analysis | Journal of Neuroscience (jneurosci.org)
Articles by Nina Kraus : The Hearing Journal (journals.lww.com)
Research Groups (upf.edu)
Speech Science Primer: Physiology, Acoustics, and Perception of Speech / Edition 6 by Lawrence J. Raphael PhD, Gloria J.... (barnesandnoble.com)
The Speech Processing Lexicon (degruyter.com)
Perception - Wikipedia (en.wikipedia.org)
CiteSeerX - Citation Query A Distributed, Developmental Model of Word Recognition and Naming. (citeseerx.ist.psu.edu)
EUROSPEECH'93 Abstract: Young et al. (isca-speech.org)
Infant Development: The Essential Readings | Infancy | Developmental Psychology | Psychology | Subjects | Wiley (wiley.com)
Prenatal exposure to antidepressants and depressed maternal mood alter trajectory of infant speech perception | PNAS (pnas.org)
The Sounds of Language: An Introduction to Phonetics and Phonology | Phonology | Theoretical Linguistics | General &... (wiley.com)
Information theory - Applications of information theory | mathematics | Britannica.com (britannica.com)
Cambridge handbook psycholinguistics | Cognition | Cambridge University Press (cambridge.org)
Brain Sciences | Free Full-Text | Musical Expertise and Second Language Learning (mdpi.com)
JMIR - Speech Perception Benefits of Internet Versus Conventional Telephony for Hearing-Impaired Individuals | Mantokoudis |... (jmir.org)
Computational Perception Laboratory: Segmentally-Boosted HMMs (cc.gatech.edu)
The Handbook of Phonetic Sciences, 2nd Edition | Phonetics | Theoretical Linguistics | General & Introductory Linguistics |... (wiley.com)
Wilder Penfield, Neural Cartographer | ScienceBlogs (scienceblogs.com)
Hearing | Encyclopedia.com (encyclopedia.com)
Mouth dries up without this calcium channel - Futurity (futurity.org)
Advances in the Spoken Language Development of Deaf and Hard-of-Hearing Children - Oxford Scholarship (oxfordscholarship.com)