Sound Localization: Ability to determine the specific location of a sound source.
Sound: A type of non-ionizing radiation in which energy is transmitted through solid, liquid, or gas as compression waves. Sound (acoustic or sonic) radiation with frequencies above the audible range is classified as ultrasonic. Sound radiation below the audible range is classified as infrasonic.
Auditory Pathways: NEURAL PATHWAYS and connections within the CENTRAL NERVOUS SYSTEM, beginning at the hair cells of the ORGAN OF CORTI, continuing along the eighth cranial nerve, and terminating at the AUDITORY CORTEX.
Acoustic Stimulation: Use of sound to elicit a response in the nervous system.
Strigiformes: An order of BIRDS with the common name owls, characterized by strongly hooked beaks, sharp talons, large heads, forward-facing eyes, and facial disks. While considered nocturnal RAPTORS, some owls do hunt by day.
Auditory Perception: The process whereby auditory stimuli are selected, organized, and interpreted by the organism.
Ear: The hearing and equilibrium system of the body. It consists of three parts: the EXTERNAL EAR, the MIDDLE EAR, and the INNER EAR. Sound waves are transmitted through this organ where vibration is transduced to nerve signals that pass through the ACOUSTIC NERVE to the CENTRAL NERVOUS SYSTEM. The inner ear also contains the vestibular organ that maintains equilibrium by transducing signals to the VESTIBULAR NERVE.
Inferior Colliculi: The posterior pair of the quadrigeminal bodies which contain centers for auditory function.
Hearing Loss, Central: Hearing loss due to disease of the AUDITORY PATHWAYS (in the CENTRAL NERVOUS SYSTEM) which originate in the COCHLEAR NUCLEI of the PONS and then ascend bilaterally to the MIDBRAIN, the THALAMUS, and then the AUDITORY CORTEX in the TEMPORAL LOBE. Bilateral lesions of the auditory pathways are usually required to cause central hearing loss. Cortical deafness refers to loss of hearing due to bilateral auditory cortex lesions. Unilateral BRAIN STEM lesions involving the cochlear nuclei may result in unilateral hearing loss.
Hearing: The ability or act of sensing and transducing ACOUSTIC STIMULATION to the CENTRAL NERVOUS SYSTEM. It is also called audition.
Psychoacoustics: The science pertaining to the interrelationship of psychologic phenomena and the individual's response to the physical properties of sound.
Auditory Threshold: The audibility limit of discriminating sound intensity and pitch.
Olivary Nucleus: A part of the MEDULLA OBLONGATA situated in the olivary body. It is involved with motor control and is a major source of sensory input to the CEREBELLUM.
Cues: Signals for an action; that specific portion of a perceptual field or pattern of stimuli to which a subject has learned to respond.
Auditory Cortex: The region of the cerebral cortex that receives the auditory radiation from the MEDIAL GENICULATE BODY.
Sound Spectrography: The graphic registration of the frequency and intensity of sounds, such as speech, infant crying, and animal vocalizations.
Heart Sounds: The sounds heard over the cardiac region produced by the functioning of the heart. There are four distinct sounds: the first occurs at the beginning of SYSTOLE and is heard as a "lubb" sound; the second is produced by the closing of the AORTIC VALVE and PULMONARY VALVE and is heard as a "dupp" sound; the third is produced by vibrations of the ventricular walls when suddenly distended by the rush of blood from the HEART ATRIA; and the fourth is produced by atrial contraction and ventricular filling.
Noise: Any sound which is unwanted or interferes with HEARING other sounds.
Cochlear Nerve: The cochlear part of the 8th cranial nerve (VESTIBULOCOCHLEAR NERVE). The cochlear nerve fibers originate from neurons of the SPIRAL GANGLION and project peripherally to cochlear hair cells and centrally to the cochlear nuclei (COCHLEAR NUCLEUS) of the BRAIN STEM. They mediate the sense of hearing.
Cochlear Nucleus: The brain stem nucleus that receives the central input from the cochlear nerve. The cochlear nucleus is located lateral and dorsolateral to the inferior cerebellar peduncles and is functionally divided into dorsal and ventral parts. It is tonotopically organized, performs the first stage of central auditory processing, and projects (directly or indirectly) to higher auditory areas including the superior olivary nuclei, the medial geniculi, the inferior colliculi, and the auditory cortex.
Functional Laterality: Behavioral manifestations of cerebral dominance in which there is preferential use and superior functioning of either the left or the right side, as in the preferred use of the right hand or right foot.
Acoustics: The branch of physics that deals with sound and sound waves. In medicine it is often applied in procedures in speech and hearing studies. With regard to the environment, it refers to the characteristics of a room, auditorium, theatre, building, etc., that determine the audibility or fidelity of sounds in it. (From Random House Unabridged Dictionary, 2d ed)
Birds: Warm-blooded VERTEBRATES possessing FEATHERS and belonging to the class Aves.
Ear, External: The outer part of the hearing system of the body. It includes the shell-like EAR AURICLE which collects sound, and the EXTERNAL EAR CANAL, the TYMPANIC MEMBRANE, and the EXTERNAL EAR CARTILAGES.
Head: The upper part of the human body, or the front or upper part of the body of an animal, typically separated from the rest of the body by a neck, and containing the brain, mouth, and sense organs.
Amplifiers, Electronic: Electronic devices that increase the magnitude of a signal's power level or current.
Gerbillinae: A subfamily of the Muridae consisting of several genera including Gerbillus, Rhombomys, Tatera, Meriones, and Psammomys.
Ear Protective Devices: Personal devices for protection of the ears from loud or high intensity noise, water, or cold. These include earmuffs and earplugs.
Hearing Loss, Bilateral: Partial hearing loss in both ears.
Hearing Tests: Part of an ear examination that measures the ability of sound to reach the brain.
Brain Stem: The part of the brain that connects the CEREBRAL HEMISPHERES with the SPINAL CORD. It consists of the MESENCEPHALON; PONS; and MEDULLA OBLONGATA.
Gryllidae: The family Gryllidae consists of the common house cricket, Acheta domesticus, which is used in neurological and physiological studies. Other genera include Gryllotalpa (mole cricket); Gryllus (field cricket); and Oecanthus (tree cricket).
Echolocation: An auditory orientation mechanism involving the emission of high frequency sounds which are reflected back to the emitter (animal).
Head Movements: Voluntary or involuntary motion of head that may be relative to or independent of body; includes animals and humans.
Evoked Potentials, Auditory: The electric response evoked in the CEREBRAL CORTEX by ACOUSTIC STIMULATION or stimulation of the AUDITORY PATHWAYS.
Trichechus manatus: Member of the genus Trichechus inhabiting the coast and coastal rivers of the southeastern United States as well as the West Indies and the adjacent mainland from Vera Cruz, Mexico to northern South America. (From Scott, Concise Encyclopedia Biology, 1996)
Neurons: The basic cellular units of nervous tissue. Each neuron consists of a body, an axon, and dendrites. Their purpose is to receive, conduct, and transmit impulses in the NERVOUS SYSTEM.
Cochlear Implants: Electronic hearing devices typically used for patients with normal outer and middle ear function, but defective inner ear function. In the COCHLEA, the hair cells (HAIR CELLS, VESTIBULAR) may be absent or damaged but there are residual nerve fibers. The device electrically stimulates the COCHLEAR NERVE to create sound sensation.
Cats: The domestic cat, Felis catus, of the carnivore family FELIDAE, comprising over 30 different breeds. The domestic cat is descended primarily from the wild cat of Africa and extreme southwestern Asia. Though probably present in towns in Palestine as long ago as 7000 years, actual domestication occurred in Egypt about 4000 years ago. (From Walker's Mammals of the World, 6th ed, p801)
Space Perception: The awareness of the spatial properties of objects; includes physical space.
Time Perception: The ability to estimate periods of time lapsed or duration of time.
Nuclear Localization Signals: Short, predominantly basic amino acid sequences identified as nuclear import signals for some proteins. These sequences are believed to interact with specific receptors at the NUCLEAR PORE.
Evoked Potentials, Auditory, Brain Stem: Electrical waves in the CEREBRAL CORTEX generated by BRAIN STEM structures in response to auditory click stimuli. These are found to be abnormal in many patients with CEREBELLOPONTINE ANGLE lesions, MULTIPLE SCLEROSIS, or other DEMYELINATING DISEASES.
Reaction Time: The time from the onset of a stimulus until a response is observed.
Pitch Perception: A dimension of auditory sensation varying with cycles per second of the sound stimulus.
Chiroptera: Order of mammals whose members are adapted for flight. It includes bats, flying foxes, and fruit bats.
Biophysical Processes: Physical forces and actions in living things.
Superior Colliculi: The anterior pair of the quadrigeminal bodies which coordinate the general behavioral orienting responses to visual stimuli, such as whole-body turning, and reaching.
Orientation: Awareness of oneself in relation to time, place and person.
Vision, Ocular: The process in which light signals are transformed by the PHOTORECEPTOR CELLS into electrical signals which can then be transmitted to the brain.
Cochlear Implantation: Surgical insertion of an electronic hearing device (COCHLEAR IMPLANTS) with electrodes to the COCHLEAR NERVE in the inner ear to create sound sensation in patients with residual nerve fibers.
Sensory Deprivation: The absence or restriction of the usual external sensory stimuli to which the individual responds.
Models, Neurological: Theoretical representations that simulate the behavior or activity of the neurological system, processes or phenomena; includes the use of mathematical equations, computers, and other electronic equipment.
Neural Inhibition: The function of opposing or restraining the excitation of neurons or their target excitable cells.
Action Potentials: Abrupt changes in the membrane potential that sweep along the CELL MEMBRANE of excitable cells in response to excitation stimuli.
Time Factors: Elements of limited time intervals, contributing to particular results or situations.
Hearing Aids: Wearable sound-amplifying devices that are intended to compensate for impaired hearing. These generic devices include air-conduction hearing aids and bone-conduction hearing aids. (UMDNS, 1999)
Cochlea: The part of the inner ear (LABYRINTH) that is concerned with hearing. It forms the anterior part of the labyrinth, as a snail-like structure that is situated almost horizontally anterior to the VESTIBULAR LABYRINTH.
Ferrets: Semidomesticated variety of European polecat much used for hunting RODENTS and/or RABBITS and as a laboratory animal. It is in the subfamily Mustelinae, family MUSTELIDAE.
Eye Movements: Voluntary or reflex-controlled movements of the eye.
Adaptation, Physiological: The non-genetic biological changes of an organism in response to challenges in its ENVIRONMENT.
Brain Mapping: Imaging techniques used to colocalize sites of brain functions or physiological activity with brain structures.
Cell Nucleus: Within a eukaryotic cell, a membrane-limited body which contains chromosomes and one or more nucleoli (CELL NUCLEOLUS). The nuclear membrane consists of a double unit-type membrane which is perforated by a number of pores; the outermost membrane is continuous with the ENDOPLASMIC RETICULUM. A cell may contain more than one nucleus. (From Singleton & Sainsbury, Dictionary of Microbiology and Molecular Biology, 2d ed)
Psychomotor Performance: The coordination of a sensory or ideational (cognitive) process and a motor activity.
Molecular Sequence Data: Descriptions of specific amino acid, carbohydrate, or nucleotide sequences which have appeared in the published literature and/or are deposited in and maintained by databanks such as GENBANK, European Molecular Biology Laboratory (EMBL), National Biomedical Research Foundation (NBRF), or other sequence repositories.
Neuronal Plasticity: The capacity of the NERVOUS SYSTEM to change its reactivity as the result of successive activations.
Amino Acid Sequence: The order of amino acids as they occur in a polypeptide chain. This is referred to as the primary structure of proteins. It is of fundamental importance in determining PROTEIN CONFORMATION.
Protein Transport: The process of moving proteins from one cellular compartment (including extracellular) to another by various sorting and transport mechanisms such as gated transport, protein translocation, and vesicular transport.
Behavior, Animal: The observable response an animal makes to any situation.
Electrodes, Implanted: Surgically placed electric conductors through which ELECTRIC STIMULATION is delivered to or electrical activity is recorded from a specific point inside the body.
Subcellular Fractions: Components of a cell produced by various separation techniques which, though they disrupt the delicate anatomy of a cell, preserve the structure and physiology of its functioning constituents for biochemical and ultrastructural analysis. (From Alberts et al., Molecular Biology of the Cell, 2d ed, p163)
Cytoplasm: The part of a cell that contains the CYTOSOL and small structures excluding the CELL NUCLEUS; MITOCHONDRIA; and large VACUOLES. (Glick, Glossary of Biochemistry and Molecular Biology, 1990)
Photic Stimulation: Investigative technique commonly used during ELECTROENCEPHALOGRAPHY in which a series of bright light flashes or visual patterns are used to elicit brain activity.
Saccades: An abrupt voluntary shift in ocular fixation from one point to another, as occurs in reading.
Electric Stimulation: Use of electric potential or currents to elicit biological responses.
Synapses: Specialized junctions at which a neuron communicates with a target cell. At classical synapses, a neuron's presynaptic terminal releases a chemical transmitter stored in synaptic vesicles which diffuses across a narrow synaptic cleft and activates receptors on the postsynaptic membrane of the target cell. The target may be a dendrite, cell body, or axon of another neuron, or a specialized region of a muscle or secretory cell. Neurons may also communicate via direct electrical coupling with ELECTRICAL SYNAPSES. Several other non-synaptic chemical or electric signal transmitting processes occur via extracellular mediated interactions.
Electrophysiology: The study of the generation and behavior of electrical charges in living organisms, particularly the nervous system, and the effects of electricity on living organisms.
Animals, Newborn: Refers to animals in the period of time just after birth.
Respiratory Sounds: Noises, normal and abnormal, heard on auscultation over any part of the RESPIRATORY TRACT.
Excitatory Postsynaptic Potentials: Depolarization of membrane potentials at the SYNAPTIC MEMBRANES of target neurons during neurotransmission. Excitatory postsynaptic potentials can singly or in summation reach the trigger threshold for ACTION POTENTIALS.
Heart Auscultation: Act of listening for sounds within the heart.
Hypothermia, Induced: Abnormally low BODY TEMPERATURE that is intentionally induced in warm-blooded animals by artificial means. In humans, mild or moderate hypothermia has been used to reduce tissue damage, particularly after cardiac or spinal cord injuries and during subsequent surgeries.
Fixation, Ocular: The positioning and accommodation of eyes that allows the image to be brought into place on the FOVEA CENTRALIS of each eye.
Glycine: A non-essential amino acid. It is found primarily in gelatin and silk fibroin and used therapeutically as a nutrient. It is also a fast inhibitory neurotransmitter.
Recombinant Fusion Proteins: Recombinant proteins produced by the GENETIC TRANSLATION of fused genes formed by the combination of NUCLEIC ACID REGULATORY SEQUENCES of one or more genes with the protein coding sequences of one or more genes.
Immunohistochemistry: Histochemical localization of immunoreactive substances using labeled antibodies as reagents.

*  Feedback loops and localization errors (PRESS RELEASE) - Fakultät für Biologie - LMU München

Adaptation in sound localization: from GABAB receptor-mediated synaptic modulation to perception. Annette Stange, Michael H ... which is the primary cue for sound localization, is nonadaptive, as its outputs are mapped directly onto a hard-wired ... Feedback loops and localization errors (PRESS RELEASE). Stange A et al. (2013); Nature Neuroscience (Grothe Lab) ...

*  Training-induced plasticity of auditory localization in adult mammals. - Oxford Neuroscience

Here we show that mature ferrets can rapidly relearn to localize sounds after having their spatial cues altered by reversibly ... However, the factors that enable and promote plasticity of auditory localization in the adult brain are unknown. ... Accurate auditory localization relies on neural computations based on spatial cues present in the sound waves at each ear. The ... mature auditory system is therefore capable of adapting to abnormal spatial information by reweighting different localization ...

*  Sound localization at cocktail parties is easi... ( Milan Italy 30 June 2011 Difference...)

Sound localization at cocktail parties is easier for men. ... Milan, Italy, 30 June 2011. Differences in male and female behaviour ... Jörg Lewald investigated the audio-spatial abilities in healthy men and women by means of a sound-localization task. ... Later, several sounds were presented simultaneously and participants had to focus on and localize only one sound. This is known ...

*  Sound localization with bilateral cochlear implants in noise: How much do head movements contribute to localization? - Zurich...

In this study, the ability of bilateral CI users to use head movements to improve sound source localization was evaluated. ...

*  A Biologically Inspired Spiking Neural Network for Sound Localization by the Inferior Colliculus - SURE

A Biologically Inspired Spiking Neural Network for Sound Localization by the Inferior Colliculus. In: 18th International ...

*  Tinnitus Miracle Book: Prospective case‐controlled sound localization study after cochlear implantation in adults with single...

Prospective case‐controlled sound localization study after cochlear implantation in adults with single‐sided deafness and ... Conclusions: Subjects can better locate sound in the CION condition than in the CIOFF condition. ...

*  Nuit Blanche: Localization of Sound Sources in a Room with One Microphone - implementation

Localization of Sound Sources in a Room with One Microphone by Helena Peic Tukuljac, Herve Lissek, Pierre Vandergheynst ... What is especially interesting about our solution is that we provide localization of the sound sources not only in the ... This repository contains a library for sparse representation of the room transfer function and code for localization of sound ... which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in ...

*  Plus it

1993) Sound localization in acallosal human listeners. Brain 116:53-69, doi:10.1093/brain/116.1.53, pmid:8453465. ... 2010) Mechanisms of sound localization in mammals. Physiol Rev 90:983-1012, doi:10.1152/physrev.00026.2009, pmid:20664077. ... Layer 5 microcircuit: sound localization model. The existence of uneven excitatory and inhibitory microcircuits between CCort ... The ability of the AC to perform sound localization processes that underlie spatial hearing depends on differences in timing ...

*  Plus it

Sound Localization Under Perturbed Binaural Hearing. Marc M Van Wanrooij, A. John Van Opstal ...

*  Auditory perception of sound reflections and source localization in dynamic scenes - RWTH AACHEN UNIVERSITY Institute of...

Auditory perception of sound reflections and source localization in dynamic scenes. ... Experiments will be carried out to determine the influence of the distance between sound source and test subject, and the head ...

*  Students | Channels - McGill University

Sound localization and auditory distance evaluation in people with visual and hearing impairment. 4 Nov 2002, 16:00 ...

*  Project Loon - Tag Search - IEEE Spectrum

Video Friday: MIT Mini Cheetah, Jibo Sound Localization, and BB-8 Meets Mars Rover. Your weekly selection of awesome robot ...

*  Plus it

Sound-intensity-dependent compensation for the small interaural time difference cue for sound source localization. J Neurosci ... Encoding of interaural timing for sound localization. In: The Senses: A Comprehensive Reference, edited by Dallos P, Oertel D. ... Mechanisms of sound localization in mammals. Physiol Rev 90: 983-1012, 2010. ... some physiological mechanisms of sound localization. J Neurophysiol 32: 613-636, 1969. ...
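The interaural time difference (ITD) discussed in these papers is, in signal-processing terms, a lag between the two ear signals, and it can be estimated from a binaural recording by cross-correlation. A minimal sketch with synthetic signals (NumPy assumed; this illustrates the general technique, not the method of any cited paper):

```python
import numpy as np

fs = 44100                      # sample rate in Hz
rng = np.random.default_rng(0)
sig = rng.standard_normal(2048)

# Simulate a source closer to the left ear: the right channel
# receives the same waveform delayed by 13 samples (~0.29 ms).
delay = 13
left = sig
right = np.concatenate([np.zeros(delay), sig])[: len(sig)]

# Cross-correlate and read off the lag at the correlation peak.
corr = np.correlate(right, left, mode="full")
lag = int(np.argmax(corr)) - (len(sig) - 1)   # lag in samples
itd_ms = 1000.0 * lag / fs                    # lag in milliseconds

print(lag, round(itd_ms, 3))   # 13 samples, ~0.295 ms
```

In reverberant rooms, practical systems usually sharpen the correlation peak with generalized cross-correlation (e.g. GCC-PHAT), but the principle is the same.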


Patent application title: SYSTEMS, METHODS, APPARATUS, AND COMPUTER-READABLE MEDIA FOR SOURCE LOCALIZATION USING AUDIBLE SOUND ... [0093] In sound recording applications (e.g., for voice communications), using emissions of audible sound to support active ... Dogs can typically hear sounds up to 40,000 Hz, cats can typically hear sounds up to 60,000 Hz, and rodents can typically hear ... 18. The method according to claim 1, wherein the audio-frequency component includes sound emitted by the sound-emitting object ...

*  Patent US4484345 - Prosthetic device for optimizing speech understanding through adjustable ... - Google Patents

Sound localization in binaural hearing aids. EP2239958A3 *. Mar 8, 2010. Jun 24, 2015. LG Electronics Inc. An apparatus for ... Bbe Sound Inc. Low input signal bandwidth compressor and amplifier control circuit with a state variable pre-amplifier. ... Method and system for sound monitoring over a network. US8976991. Apr 30, 2010. Mar 10, 2015. Hear-Wear Technologies, Llc. BTE/ ... Method and system for sound monitoring over a network. USRE42737. Jan 10, 2008. Sep 27, 2011. Akiba Electronics Institute Llc. ...

*  AES E-Library Search Results

Subject: Sound Localization in 3D Space. Click to purchase paper as a non-member or login as an AES member. ... This has paved the way for Virtual Reality sound where precision of sound is necessary for complete immersion in a virtual ... 2016 AES International Conference on Audio for Virtual and Augmented Reality

*  Ask an Expert: Grades 9-12: Math and Computer Science

Sound localization Last post by onur123 « Sat Oct 29, 2016 5:47 pm ... sound simulation in closed chamber Last post by MadelineB « Sun Oct 09, 2016 1:50 pm ...

*  The Munc13 Proteins Differentially Regulate Readily Releasable Pool Dynamics and Calcium-Dependent Recovery at a Central...

2010) Mechanisms of sound localization in mammals. Physiol Rev 90:983-1012, doi:10.1152/physrev.00026.2009, pmid:20664077. ... to ensure synaptic reliability at high rates for proper sound localization. Using Munc13-2, Munc13-3, and Munc13-2/3 deletion ... 2B-D, 3A--C), in agreement with previous reports on Munc13 localization (Kalla et al., 2006; Cooper et al., 2012). Munc13-1- ... Munc13 isoform expression and localization at the calyx. Past studies have suggested that two isoforms of Munc13 are expressed ...

*  Audio Engineering

Neural models of sound localisation. * Assessment of components and codecs for music reproduction. ... The EPSRC-funded Making Sense of Sound project is converting sound data into understandable and actionable information ... 3D sound can offer listeners the experience of "being there" at a live event, such as the Proms, but currently requires highly ... Salford is heading the work on perception of everyday sounds. Audio signal processing is applied in ...

*  Tuning of timing in auditory axons - LMU Munich

... the decisive cues for sound localization arise from the fact that a given sound both arrives earlier at, and is perceived as ... the responses of the two ears to a given sound to higher processing centers is crucial for accurate localization of the sound ... An LMU team has shown that the axons of auditory neurons in the brainstem which respond to low and high-frequency sounds differ ... axons that are most sensitive to low-frequency tones are larger in diameter than those that respond to high-frequency sounds ...

*  Application of spatial sound reproduction in virtual environments : experiments in localization, navigation, and orientation

spatial sound reproduction, virtual reality, localization of sound sources, navigation, orientation, keinotodellisuus (virtual reality), tilaääni (spatial sound) ... Localization of the moving sound sources was not as accurate as localization of the static sources. ... Localization of a moving virtual sound source in a virtual room, the effect of a distracting auditory stimulus. In: Proceedings ... Static and dynamic sound source localization in a virtual room. In: Proceedings of the AES 22nd International Conference on ...

*  Echolocation Demonstration | Science Project |

Have a blindfolded subject locate you by the sound of your clapping. ... Demonstrate how we are able to locate sounds: echolocation. ... The Psychophysics of Human Sound Localization. Cambridge, ... But, together, both ears are able to detect sound location through minute differences in timing. If a sound is coming from our ... Each time you clap, the subject should turn and face the direction that they think the sound is coming from. Do this several ...
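The minute timing differences this demonstration relies on can be put in numbers with the classic Woodworth spherical-head approximation, ITD ≈ (r/c)(θ + sin θ). This is a textbook formula rather than part of the project text, and the head radius and speed of sound below are typical illustrative values:

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference in seconds for a spherical head.

    Woodworth model: ITD = (r / c) * (theta + sin(theta)), where theta is
    the source azimuth in radians (0 = straight ahead, 90 = directly to one side).
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

print(woodworth_itd(0))    # 0.0 s: no timing difference for a source straight ahead
print(woodworth_itd(90))   # maximum ITD, about 0.00066 s (0.66 ms)
```

The sub-millisecond maximum shows why the demonstration works best with sharp, broadband sounds like claps: the auditory system must resolve timing differences far shorter than a millisecond.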

*  Exams | Intensive Neuroanatomy | Brain and Cognitive Sciences | MIT OpenCourseWare

c. Such damage will cause problems with sound localization. d. Such damage will cause no problems ... Name one station in the auditory pathway responsible for sound localization ____________ (be specific). ... 38) Name one station in the auditory pathway responsible for tonotopic sound identification ____________ (be specific). ...

*  auditory cortex

Sound localization is a fundamental task of the auditory system, and in higher mammals requires an intact auditory cortex. ... An apparent paradox exists, where lesions of the auditory cortex result in profound sound localization deficits, but the ... We tested speech sound discrimination in rats and recorded primary auditory cortex (A1) responses to speech sounds in ...


Masakazu Konishi: Gruber Prize in Neuroscience; Yamashina Award.
Sound change.
Delay line memory: Delay line memory is a form of computer memory, now obsolete, that was used on some of the earliest digital computers. Like many modern forms of electronic computer memory, delay line memory was a refreshable memory, but as opposed to modern random-access memory, delay line memory was sequential-access.
Auditory scene analysis: In psychophysics, auditory scene analysis (ASA) is a proposed model for the basis of auditory perception. This is understood as the process by which the human auditory system organizes sound into perceptually meaningful elements.
Crystal earpiece: A crystal earpiece is a type of piezoelectric earphone, producing sound by using a piezoelectric crystal, a material that changes its shape when electricity is applied to it. It is usually designed to plug into the ear canal of the user.
Auditory neuropathy: Auditory neuropathy (AN) is a variety of hearing loss in which the outer hair cells within the cochlea are present and functional, but sound information is not faithfully transmitted to the auditory nerve and brain properly. Also known as Auditory Neuropathy/Auditory Dys-synchrony (AN/AD) or Auditory Neuropathy Spectrum Disorder (ANSD).
Equivalent rectangular bandwidth: The equivalent rectangular bandwidth or ERB is a measure used in psychoacoustics, which gives an approximation to the bandwidths of the filters in human hearing, using the unrealistic but convenient simplification of modeling the filters as rectangular band-pass filters.
Psychoacoustics: Psychoacoustics is the scientific study of sound perception. More specifically, it is the branch of science studying the psychological and physiological responses associated with sound (including speech and music).
Cue stick: A cue stick (or simply cue, more specifically pool cue, snooker cue, or billiards cue) is an item of sporting equipment essential to the games of pool, snooker and carom billiards. It is used to strike a ball.
Fourth heart sound.
List of noise topics: This is a list of noise topics.
American Chopper (season 4).
Cerebral hemisphere: The vertebrate cerebrum (brain) is formed by two cerebral hemispheres that are separated by a groove, the medial longitudinal fissure. The brain can thus be described as being divided into left and right cerebral hemispheres.
Acoustics Research Institute.
Bird trapping: Bird trapping techniques to capture wild birds include a wide range of techniques that have their origins in the hunting of birds for food. While hunting for food does not require birds to be caught alive, some trapping techniques capture birds without harming them and are of use in ornithology research.
Preauricular sinus and cyst.
Optical amplifier: An optical amplifier is a device that amplifies an optical signal directly, without the need to first convert it to an electrical signal. An optical amplifier may be thought of as a laser without an optical cavity, or one in which feedback from the cavity is suppressed.
Muzzle brake: A muzzle brake or recoil compensator is a device connected to the muzzle of a firearm or cannon that redirects propellant gases to counter recoil and unwanted rising of the barrel during rapid fire (muzzle brake in the NRA Firearms Glossary). The concept was introduced for artillery and was a common feature on many anti-tank guns, especially those in tanks, in order to reduce the area needed to take up the recoil stroke.
Central tegmental tract: The central tegmental tract. (Kamali A, Kramer LA, Butler IJ, Hasan KM. Diffusion tensor tractography of the somatosensory system in the human brainstem: initial findings using high isotropic spatial resolution at 3.)
Teleogryllus oceanicus: Teleogryllus oceanicus, commonly known as the Australian, Pacific or oceanic field cricket, is a cricket found across Oceania and in coastal Australia from Carnarvon in Western Australia to Rockhampton in north-east Queensland. (Otte, D. & Alexander, R.)
Human echolocation: Human echolocation is the ability of humans to detect objects in their environment by sensing echoes from those objects. By actively creating sounds – for example, by tapping their canes, lightly stomping their foot, snapping their fingers, or making clicking noises with their mouths – people trained to orient by echolocation can interpret the sound waves reflected by nearby objects, accurately identifying their location and size. This ability is used by some blind people for acoustic wayfinding, or navigating within their environment using auditory rather than visual cues.
Fall Heads Roll: Fall Heads Roll is an album by The Fall, released in 2005. It was recorded at Gracieland Studios in Rochdale, UK and Gigantic Studios in New York, NY.
Auditory event: Auditory events describe the subjective perception when listening to a certain sound situation. This term was introduced by Jens Blauert (Ruhr-University Bochum) in 1966, in order to distinguish clearly between the physical sound field and the auditory perception of the sound.
Port Manatee: Port Manatee is a deepwater seaport located in the eastern Gulf of Mexico at the entrance to Tampa Bay in northern Manatee County, Florida. It is one of Florida's largest deepwater seaports.
HSD2 neurons: HSD2 neurons are a small group of neurons in the brainstem which are uniquely sensitive to the mineralocorticosteroid hormone aldosterone, through expression of HSD11B2. They are located within the caudal medulla oblongata, in the nucleus of the solitary tract (NTS).
Cats in the United States: Many different species of mammal can be classified as cats (felids) in the United States. These include domestic cat (both house cats and feral), of the species Felis catus; medium-sized wild cats from the genus Lynx; and big cats from the genera Puma and Panthera.
Precise Time and Time Interval: Precise Time and Time Interval (PTTI) is a Department of Defense Military Standard which details a mechanism and waveform for distributing highly accurate timing information.
Frequency following response: Frequency following response (FFR), also referred to as Frequency Following Potential (FFP), is an evoked potential generated by periodic or nearly-periodic auditory stimuli. (Burkard, R.)
Pitch space.
Horseshoe bat: Horseshoe bats make up the bat family Rhinolophidae. In addition to the single living genus, Rhinolophus, one extinct genus, Palaeonycteris, has been recognized.
Fixation reflex: The fixation reflex is that concerned with attracting the eye on a peripheral object. For example, when a light shines in the periphery, the eyes shift gaze on it.
Canon EOS 5.
Ventricular action potential.
Temporal analysis of products: Temporal Analysis of Products (TAP), (TAP-2), (TAP-3) is an experimental technique for studying
Beltone.
Ferret: The ferret (Mustela putorius furo) is the domesticated form of the European polecat, a mammal belonging to the same genus as the weasel, Mustela, of the family Mustelidae. (Harris & Yalden 2008, pp.)
Maladaptation: A maladaptation is a trait that is (or has become) more harmful than helpful, in contrast with an adaptation, which is more helpful than harmful. All organisms, from bacteria to humans, display maladaptive and adaptive traits.
Coles Phillips.
Homeostatic plasticity: In neuroscience, homeostatic plasticity refers to the capacity of neurons to regulate their own excitability relative to network activity, a compensatory adjustment that occurs over the timescale of days.
Synaptic scaling has been proposed as a potential mechanism of homeostatic plasticity.Protein primary structure: The primary structure of a peptide or protein is the linear sequence of its amino acid structural units, and partly comprises its overall biomolecular structure. By convention, the primary structure of a protein is reported starting from the amino-terminal (N) end to the carboxyl-terminal (C) end.Protoplasm: Protoplasm is the living content of a cell that is surrounded by a plasma membrane. It is a general term for the cytoplasm.Saccade: A saccade ( , French for jerk) is quick, simultaneous movement of both eyes between two phases of fixation in the same direction.Cassin, B.Cortical stimulation mapping: Cortical stimulation mapping (often shortened to CSM) is a type of electrocorticography that involves a physically invasive procedure and aims to localize the function of specific brain regions through direct electrical stimulation of the cerebral cortex. It remains one of the earliest methods of analyzing the brain and has allowed researchers to study the relationship between cortical structure and systemic function.Silent synapse: In neuroscience, a silent synapse is an excitatory glutamatergic synapse whose postsynaptic membrane contains NMDA-type glutamate receptors but no AMPA-type glutamate receptors. These synapses are named "silent" because normal AMPA receptor-mediated signaling is not present, rendering the synapse inactive under typical conditions.Periodic current reversalCardiovascular examination: The Cardiovascular examination is a portion of the physical examination that involves evaluation of the cardiovascular system.Hypothermia cap: A hypothermia cap (also referred to as "cold cap" or "cooling cap") is a therapeutic device used to cool the human scalp. 
Its most prominent medical applications are in preventing or reducing alopecia in chemotherapy, and for preventing cerebral palsy in babies born with neonatal encephalopathy caused by hypoxic-ischemic encephalopathy (HIE).Glycine (plant): Glycine is a genus in the bean family Fabaceae. The best known species is the soybean (Glycine max).
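The equivalent rectangular bandwidth mentioned above is commonly computed with the Glasberg–Moore approximation, ERB(f) ≈ 24.7 · (4.37 · f/1000 + 1) Hz. This formula comes from the psychoacoustics literature rather than from this page, so the sketch below is illustrative:

```python
def erb_hz(f_hz: float) -> float:
    """Glasberg-Moore approximation of the equivalent rectangular
    bandwidth (in Hz) of the auditory filter centered at f_hz."""
    return 24.7 * (4.37 * f_hz / 1000.0 + 1.0)

# Auditory filter bandwidth grows roughly linearly with center frequency:
bw_500 = erb_hz(500.0)    # roughly 79 Hz at 500 Hz
bw_4000 = erb_hz(4000.0)  # roughly 456 Hz at 4 kHz
```

The widening of the filters toward high frequencies is one reason interaural time cues dominate localization at low frequencies while level cues dominate at high ones.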

(1/773) Midbrain combinatorial code for temporal and spectral information in concurrent acoustic signals.

All vocal species, including humans, often encounter simultaneous (concurrent) vocal signals from conspecifics. To segregate concurrent signals, the auditory system must extract information regarding the individual signals from their summed waveforms. During the breeding season, nesting male midshipman fish (Porichthys notatus) congregate in localized regions of the intertidal zone and produce long-duration (>1 min), multi-harmonic signals ("hums") during courtship of females. The hums of neighboring males often overlap, resulting in acoustic beats with amplitude and phase modulations at the difference frequencies (dFs) between their fundamental frequencies (F0s) and harmonic components. Behavioral studies also show that midshipman can localize a single hum-like tone when presented with a choice between two concurrent tones that originate from separate speakers. A previous study of the neural mechanisms underlying the segregation of concurrent signals demonstrated that midbrain neurons temporally encode a beat's dF through spike synchronization; however, spectral information about at least one of the beat's components is also required for signal segregation. Here we examine the encoding of spectral differences in beat signals by midbrain neurons. The results show that, although the spike rate responses of many neurons are sensitive to the spectral composition of a beat, virtually all midbrain units can encode information about differences in the spectral composition of beat stimuli via their interspike intervals (ISIs) with an equal distribution of ISI spectral sensitivity across the behaviorally relevant dFs. Together, temporal encoding in the midbrain of dF information through spike synchronization and of spectral information through ISI could permit the segregation of concurrent vocal signals.
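The acoustic beats described in this abstract arise simply from summing two concurrent tones with nearby fundamentals: the envelope of the sum fluctuates at the difference frequency (dF). A minimal sketch (the frequencies are illustrative, not taken from the midshipman study):

```python
import numpy as np

fs = 44100                      # sample rate (Hz)
t = np.arange(fs) / fs          # 1 s of time
f1, f2 = 100.0, 104.0           # two overlapping "hums"; dF = 4 Hz
summed = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Squaring the summed waveform makes the envelope modulation explicit:
# its spectrum contains a component at the difference frequency dF.
spec = np.abs(np.fft.rfft(summed ** 2))
# With a 1 s window the FFT bin spacing is exactly 1 Hz; skip DC (bin 0)
# and search well below 2*f1 so only the dF component is in range.
dF_bin = int(np.argmax(spec[1:200])) + 1   # expected: 4 (i.e., 4 Hz)
```

This is the same dF that the midbrain neurons in the study encode through spike synchronization.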

(2/773) Desynchronizing responses to correlated noise: A mechanism for binaural masking level differences at the inferior colliculus.

We examined the adequacy of decorrelation of the responses to dichotic noise as an explanation for the binaural masking level difference (BMLD). The responses of 48 low-frequency neurons in the inferior colliculus of anesthetized guinea pigs were recorded to binaurally presented noise with various degrees of interaural correlation and to interaurally correlated noise in the presence of 500-Hz tones in either zero or pi interaural phase. In response to fully correlated noise, neurons' responses were modulated with interaural delay, showing quasiperiodic noise delay functions (NDFs) with a central peak and side peaks, separated by intervals roughly equivalent to the period of the neuron's best frequency. For noise with zero interaural correlation (independent noises presented to each ear), neurons were insensitive to the interaural delay. Their NDFs were unmodulated, with the majority showing a level of activity approximately equal to the mean of the peaks and troughs of the NDF obtained with fully correlated noise. Partial decorrelation of the noise resulted in NDFs that were, in general, intermediate between the fully correlated and fully decorrelated noise. Presenting 500-Hz tones simultaneously with fully correlated noise also had the effect of demodulating the NDFs. In the case of tones with zero interaural phase, this demodulation appeared to be a saturation process, raising the discharge at all noise delays to that at the largest peak in the NDF. In the majority of neurons, presenting the tones in pi phase had a similar effect on the NDFs to decorrelating the noise; the response was demodulated toward the mean of the peaks and troughs of the NDF. Thus the effect of added tones on the responses of delay-sensitive inferior colliculus neurons to noise could be accounted for by a desynchronizing effect. This result is entirely consistent with cross-correlation models of the BMLD. 
However, in some neurons, the effects of an added tone on the NDF appeared more extreme than the effect of decorrelating the noise, suggesting the possibility of additional inhibitory influences.
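The interaural correlation manipulated in this study can be quantified as the normalized cross-correlation of the left- and right-ear noise: mixing an independent noise into one ear lowers it in a graded way. The mixing rule below is a standard construction for generating partially correlated noise, not necessarily the authors' exact stimulus-generation method:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
common = rng.standard_normal(n)   # noise component shared by both ears
indep = rng.standard_normal(n)    # noise component unique to the right ear

def interaural_correlation(rho: float) -> float:
    """Mix toward a target interaural correlation rho, then measure it."""
    left = common
    right = rho * common + np.sqrt(1.0 - rho ** 2) * indep
    return float(np.corrcoef(left, right)[0, 1])

full = interaural_correlation(1.0)   # fully correlated noise, ~1.0
zero = interaural_correlation(0.0)   # independent noises, ~0.0
half = interaural_correlation(0.5)   # partial decorrelation, ~0.5
```

Delay-sensitive neurons respond to `full` with strongly modulated noise delay functions and to `zero` with flat ones, with `half` giving the intermediate case described above.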

(3/773) Early visual experience shapes the representation of auditory space in the forebrain gaze fields of the barn owl.

Auditory spatial information is processed in parallel forebrain and midbrain pathways. Sensory experience early in life has been shown to exert a powerful influence on the representation of auditory space in the midbrain space-processing pathway. The goal of this study was to determine whether early experience also shapes the representation of auditory space in the forebrain. Owls were raised wearing prismatic spectacles that shifted the visual field in the horizontal plane. This manipulation altered the relationship between interaural time differences (ITDs), the principal cue used for azimuthal localization, and locations of auditory stimuli in the visual field. Extracellular recordings were used to characterize ITD tuning in the auditory archistriatum (AAr), a subdivision of the forebrain gaze fields, in normal and prism-reared owls. Prism rearing altered the representation of ITD in the AAr. In prism-reared owls, unit tuning for ITD was shifted in the adaptive direction, according to the direction of the optical displacement imposed by the spectacles. Changes in ITD tuning involved the acquisition of unit responses to adaptive ITD values and, to a lesser extent, the elimination of responses to nonadaptive (previously normal) ITD values. Shifts in ITD tuning in the AAr were similar to shifts in ITD tuning observed in the optic tectum of the same owls. This experience-based adjustment of binaural tuning in the AAr helps to maintain mutual registry between the forebrain and midbrain representations of auditory space and may help to ensure consistent behavioral responses to auditory stimuli.

(4/773) Auditory perception: does practice make perfect?

Recent studies have shown that adult humans can learn to localize sounds relatively accurately when provided with altered localization cues. These experiments provide further evidence for experience-dependent plasticity in the mature brain.

(5/773) Single cortical neurons serve both echolocation and passive sound localization.

The pallid bat uses passive listening at low frequencies to detect and locate terrestrial prey and reserves its high-frequency echolocation for general orientation. While hunting, this bat must attend to both streams of information. These streams are processed through two parallel, functionally specialized pathways that are segregated at the level of the inferior colliculus. This report describes functionally bimodal neurons in auditory cortex that receive converging input from these two pathways. Each brain stem pathway imposes its own suite of response properties on these cortical neurons. Consequently, the neurons are bimodally tuned to low and high frequencies, and respond selectively to both noise transients used in prey detection, and downward frequency modulation (FM) sweeps used in echolocation. A novel finding is that the monaural and binaural response properties of these neurons can change as a function of the sound presented. The majority of neurons appeared binaurally inhibited when presented with noise but monaural or binaurally facilitated when presented with the echolocation pulse. Consequently, their spatial sensitivity will change, depending on whether the bat is engaged in echolocation or passive listening. These results demonstrate that the response properties of single cortical neurons can change with behavioral context and suggest that they are capable of supporting more than one behavior.

(6/773) Functional selection of adaptive auditory space map by GABAA-mediated inhibition.

The external nucleus of the inferior colliculus in the barn owl contains an auditory map of space that is based on the tuning of neurons for interaural differences in the timing of sound. In juvenile owls, this region of the brain can acquire alternative maps of interaural time difference as a result of abnormal experience. It has been found that, in an external nucleus that is expressing a learned, abnormal map, the circuitry underlying the normal map still exists but is functionally inactivated by inhibition mediated by gamma-aminobutyric acid type A (GABAA) receptors. This inactivation results from disproportionately strong inhibition of specific input channels to the network. Thus, experience-driven changes in patterns of inhibition, as well as adjustments in patterns of excitation, can contribute critically to adaptive plasticity in the central nervous system.

(7/773) Sensitivity to simulated directional sound motion in the rat primary auditory cortex.

This paper examines neuron responses in rat primary auditory cortex (AI) during sound stimulation of the two ears designed to simulate sound motion in the horizontal plane. The simulated sound motion was synthesized from mathematical equations that generated dynamic changes in interaural phase, intensity, and Doppler shifts at the two ears. The simulated sounds were based on moving sources in the right frontal horizontal quadrant. Stimuli consisted of three circumferential segments between 0 and 30 degrees, 30 and 60 degrees, and 60 and 90 degrees and four radial segments at 0, 30, 60, and 90 degrees. The constant velocity portion of each segment was 0.84 m long. The circumferential segments and center of the radial segments were calculated to simulate a distance of 2 m from the head. Each segment had two trajectories that simulated motion in both directions, and each trajectory was presented at two velocities. Young adult rats were anesthetized, the left primary auditory cortex was exposed, and microelectrode recordings were obtained from sound responsive cells in AI. All testing took place at a tonal frequency that most closely approximated the best frequency of the unit at a level 20 dB above the tuning curve threshold. The results were presented on polar plots that emphasized the two directions of simulated motion for each segment rather than the location of sound in space. The trajectory exhibiting a "maximum motion response" could be identified from these plots. "Neuron discharge profiles" within these trajectories were used to demonstrate neuron activity for the two motion directions. Cells were identified that clearly responded to simulated uni- or multidirectional sound motion (39%), that were sensitive to sound location only (19%), or that were sound driven but insensitive to our location or sound motion stimuli (42%).
The results demonstrated the capacity of neurons in rat auditory cortex to selectively process dynamic stimulus conditions representing simulated motion on the horizontal plane. Our data further show that some cells were responsive to location along the horizontal plane but not sensitive to motion. Cells sensitive to motion, however, also responded best to the moving sound at a particular location within the trajectory. It would seem that the mechanisms underlying sensitivity to sound location as well as direction of motion converge on the same cell.

(8/773) Influence of head position on the spatial representation of acoustic targets.

Sound localization in humans relies on binaural differences (azimuth cues) and monaural spectral shape information (elevation cues) and is therefore the result of a neural computational process. Despite the fact that these acoustic cues are referenced with respect to the head, accurate eye movements can be generated to sounds in complete darkness. This ability necessitates the use of eye position information. So far, however, sound localization has been investigated mainly with a fixed head position, usually straight ahead. Yet the auditory system may rely on head motor information to maintain a stable and spatially accurate representation of acoustic targets in the presence of head movements. We therefore studied the influence of changes in eye-head position on auditory-guided orienting behavior of human subjects. In the first experiment, we used a visual-auditory double-step paradigm. Subjects made saccadic gaze shifts in total darkness toward brief broadband sounds presented before an intervening eye-head movement that was evoked by an earlier visual target. The data show that the preceding displacements of both eye and head are fully accounted for, resulting in spatially accurate responses. This suggests that auditory target information may be transformed into a spatial (or body-centered) frame of reference. To further investigate this possibility, we exploited the unique property of the auditory system that sound elevation is extracted independently from pinna-related spectral cues. In the absence of such cues, accurate elevation detection is not possible, even when head movements are made. This is shown in a second experiment where pure tones were localized at a fixed elevation that depended on the tone frequency rather than on the actual target elevation, both under head-fixed and -free conditions. 
To test, in a third experiment, whether the perceived elevation of tones relies on a head- or space-fixed target representation, eye movements were elicited toward pure tones while subjects kept their head in different vertical positions. It appeared that each tone was localized at a fixed, frequency-dependent elevation in space that shifted to a limited extent with changes in head elevation. Hence information about head position is used under static conditions too. Interestingly, the influence of head position also depended on the tone frequency. Thus tone-evoked ocular saccades typically showed a partial compensation for changes in static head position, whereas noise-evoked eye-head saccades fully compensated for intervening changes in eye-head position. We propose that the auditory localization system combines the acoustic input with head-position information to encode targets in a spatial (or body-centered) frame of reference. In this way, accurate orienting responses may be programmed despite intervening eye-head movements. A conceptual model, based on the tonotopic organization of the auditory system, is presented that may account for our findings.


  • In the auditory cortex (AC), interhemispheric communication is involved in sound localization processes underlying spatial hearing.
  • This invention relates to the sound amplification arts and to their application in the amelioration of auditory deficiencies, both conductive and sensorineural in nature as related to the human ear.
  • Although approximate compared to visual localization, auditory localization is paramount for VR: it is lighting condition-independent, omnidirectional, not as subject to occlusion, and creates presence.
  • In a multi-part study, first-person horizontal movement between two virtual sound source locations in an auditory virtual environment (AVE) was investigated by evaluating the sensation of motion as perceived by the listener.
  • An LMU team has shown that the axons of auditory neurons in the brainstem which respond to low- and high-frequency sounds differ in their morphology, and that these variations correlate with differences in the speed of signal conduction.

Interaural time

  • Interaural time disparities (ITDs) are the main cue that animals use to localize low-frequency sounds.
  • Nonetheless, it has long been assumed that the processing of interaural time differences, which is the primary cue for sound localization, is nonadaptive, as its outputs are mapped directly onto a hard-wired representation of space.
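A classic reading of the ITD cue is that the brain effectively cross-correlates the two ears' signals and reads off the lag of the peak (the Jeffress-style coincidence model). The toy estimate below uses synthetic signals and an idealized pure delay, so it is a sketch of the principle rather than a model of real binaural processing:

```python
import numpy as np

fs = 48000                            # sample rate (Hz)
rng = np.random.default_rng(1)
src = rng.standard_normal(fs // 10)   # 100 ms of broadband noise
true_itd_samples = 12                 # right ear lags by 12 samples (~250 us)

left = src
right = np.concatenate([np.zeros(true_itd_samples),
                        src[:-true_itd_samples]])

# Cross-correlate the two ear signals; the lag of the peak is the
# ITD estimate (positive lag: the left ear leads).
xcorr = np.correlate(right, left, mode="full")
lag = int(np.argmax(xcorr)) - (len(left) - 1)

itd_us = 1e6 * lag / fs               # estimated ITD in microseconds
```

For broadband sounds this recovers the delay exactly; for narrowband tones the correlation is periodic, which is why ITDs are an ambiguous cue at high frequencies.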


  • Sound localization with bilateral cochlear implants in noise: How much do head movements contribute to localization?
  • Bilateral cochlear implant (CI) users encounter difficulties in localizing sound sources in everyday environments, especially in the presence of background noise and reverberation.


  • Ida Zündorf from the Center of Neurology at Tübingen University, together with Prof. Hans-Otto Karnath and Dr. Jörg Lewald, investigated the audio-spatial abilities of healthy men and women by means of a sound-localization task.
  • Since this male advantage was only found in the cocktail party situation, i.e. women performed equally well when sounds were presented one at a time, this indicates that the difference is related to a "high attentional mechanism" in the brain specifically involved in extracting spatial information about one particular sound source in a noisy environment.
  • Near Field compensated Higher Order Ambisonics (HOA) and Vector Base Amplitude Panning (VBAP) are investigated for both spatial accuracy and tonal coloration with moving sound source trajectories.
  • Guidance about the interactions of subjective judgments and spatial presence for sound positioning is needed for non-specialists to leverage VR's spatial sound environment.
  • We also have ambisonics and binaural reproduction systems which are being used on joint research projects with the BBC to investigate the future of spatial audio, such as adding height to surround sound systems and what happens when listeners aren't in the sweet spot.


  • Later, several sounds were presented simultaneously and participants had to focus on and localize only one sound.
  • In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone.


  • Furthermore, psychophysical tests showed that the paradigm used to evoke neuronal GABAB receptor-mediated adaptation causes the perceptual shift in sound localization in humans that was expected on the basis of our physiological results in gerbils.
  • The EPSRC-funded Making Sense of Sound project works on converting sound data into forms of information understandable and actionable by both humans and machines.


  • This study examines the extent to which disparity in azimuth location between a sound cue and image target can be varied in cinematic virtual reality (VR) content, before presence is broken.
  • It applies disparity consistently and inconsistently across five otherwise identical sound-image events.


  • Participants were asked to listen to sounds and determine the location of the sound source, either by pointing towards it or by naming the exact position (e.g. 45 degrees left).
  • This is known as the cocktail party phenomenon: the human capacity to detect and focus on one particular sound source in a noisy environment.
  • In this study, the ability of bilateral CI users to use head movements to improve sound source localization was evaluated.
  • In complex environments, human beings use early boundary reflections (walls, ceilings, floors, large objects) and their relation to a sound source for acoustic orientation.
  • Experiments will be carried out to determine the influence of the distance between sound source and test subject, and of head movement, on the maximum number of audible sources.
  • Since both ears encode a given acoustic stimulus in the same way, the decisive cues for sound localization arise from the fact that a given sound both arrives earlier at, and is perceived as louder by, the 'ipsilateral' ear (the one closer to the sound source) than the stimulus that reaches the 'contralateral' ear.
  • Hence, precise communication of the timing difference between the responses of the two ears to a given sound to higher processing centers is crucial for accurate localization of the sound source.


  • Salford is heading the work on perception of everyday sounds.


  • What is especially interesting about our solution is that we provide localization of the sound sources not only in the horizontal plane, but in terms of 3D coordinates inside the room.


  • Estimation of the location of sound sources is usually done using microphone arrays.
  • Such settings provide an environment where we know the difference between the received signals among different microphones in terms of phase or attenuation, which enables localization of the sound sources.
  • This repository contains a library for sparse representation of the room transfer function and code for localization of sound sources in a room with one microphone.
  • This paper investigates the rendering of moving sound sources in the context of real-world loudspeaker arrays and virtual loudspeaker arrays for binaural listening in VR experiences.
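The microphone-array localization described above follows directly from the geometry: under a far-field assumption, a time difference of arrival τ between two microphones spaced d apart constrains the source direction to θ = arcsin(c·τ/d) from broadside. A minimal two-microphone sketch (spacing and delays are illustrative values, not from any of the cited systems):

```python
import numpy as np

C = 343.0   # speed of sound in air at room temperature (m/s)
D = 0.20    # microphone spacing (m)

def doa_from_tdoa(tau_s: float) -> float:
    """Far-field direction of arrival (degrees from broadside) for a
    time difference of arrival tau_s between two microphones."""
    s = np.clip(C * tau_s / D, -1.0, 1.0)   # guard against |sin| > 1
    return float(np.degrees(np.arcsin(s)))

broadside = doa_from_tdoa(0.0)         # equidistant source: 0 degrees
endfire = doa_from_tdoa(D / C)         # source on the mic axis: 90 degrees
oblique = doa_from_tdoa(0.5 * D / C)   # sin(theta) = 0.5: 30 degrees
```

With only two microphones the estimate is ambiguous front/back and gives no elevation; arrays with more elements resolve full 3D coordinates, as the repository above aims to do with a single microphone by exploiting the room transfer function instead.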


  • At first, sounds were presented one at a time and both men and women accomplished the task with great accuracy.


  • Interestingly, women found the second task much more difficult, compared to men, to the extent that in some cases they even thought the sounds were coming from the opposite direction.


  • We have three wavefield synthesis systems, including a portable system which has been used to recreate the prehistoric sounds of Stonehenge.