Sound Localization
PSY 310, Greg Francis, Lecture 31
Physics and psychology

Audition
We now have some idea of how sound properties are recorded by the auditory system
So, we know what kind of information the nervous system has to work with
We now return to the sound stimulus and look at what properties of the stimulus provide information about the environment

Sound localization
Where does a sound come from in the environment?
Demonstration
Localization
In many respects, vision is better suited for localization than audition
Light sources in different places in the environment project to different places on the retina

Visual localization
Of course, it's not trivial to figure out the distance
But at least we can get the relative direction of lights from information that is on the retina
Auditory localization
The situation seems worse for the auditory system
The basilar membrane has no representation of the position of a sound
Only frequency and amplitude (size of the wave)

Coordinates
The location of a sound source can be described by three coordinates
Azimuth (left-right, relative to some reference, often the front of the face)
Elevation (up-down, relative to some reference, often the front of the face)
Distance (how far away)
Vision easily handles the first two
Judging the azimuth
Just as we use two eyes to estimate distance for vision
We can use two ears to estimate azimuth for audition
Two main cues for azimuth location:
Interaural time difference: sound reaches one ear before the other
Interaural level difference: the sound reaching one ear is stronger than the sound reaching the other ear

Time difference
Sounds from straight ahead reach both ears at the same time
Sounds to the side reach the closer ear a bit sooner than the farther ear
Time difference
The time differences are small, but they are enough for the auditory system to judge the left-right position of the sound source
The speed of sound is around 340 m/s (it depends on air pressure, temperature, humidity, ...)

Time difference
Neurons in the brain's auditory system (in the superior olive) respond to pairs of neural signals that arrive with the right kind of delay
Think of Reichardt detectors for motion perception
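To get a feel for how small these time differences are, here is a sketch using Woodworth's classic spherical-head approximation. The head radius of 8.75 cm is an assumed average value, not a figure from the lecture.

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=340.0):
    """Woodworth's approximation of the interaural time difference.

    azimuth_deg: source direction, 0 = straight ahead, 90 = directly
    to one side. head_radius_m is an assumed average head radius.
    """
    theta = math.radians(azimuth_deg)
    # Path difference: the wave wraps around the head (r * theta) plus
    # the straight-line offset to the near ear (r * sin(theta)).
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

print(itd_seconds(0))    # 0.0: straight ahead, no difference
print(itd_seconds(90))   # roughly 0.00066 s, i.e. well under a millisecond
```

Even for a source directly to one side, the difference is only about two-thirds of a millisecond, which is why the superior olive needs such precisely timed delay circuits.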
Level difference
A sound off to one side reaches the close ear unimpeded
To reach the opposite ear, the sound has to travel around the head
This is not a problem for a relatively low-frequency sound

Level difference
A sound off to one side reaches the close ear unimpeded
To reach the opposite ear, the sound has to travel around the head
It can be a problem for a high-frequency sound
Acoustic shadows (physics)
Level difference
High-frequency sounds are reduced in amplitude on the far side of the head
How much depends on the frequency and on the location of the sound source

Level difference
Recordings of sounds at the two ears show some evidence of this effect
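The frequency dependence follows from diffraction: a wave bends around obstacles smaller than its wavelength. As a rough sketch, assuming an average head diameter of about 17.5 cm (an assumed value):

```python
# Sound diffracts around the head when its wavelength is larger than
# the head, so low frequencies cast little acoustic shadow.
SPEED_OF_SOUND = 340.0   # m/s
HEAD_DIAMETER = 0.175    # m, assumed average

def wavelength_m(freq_hz):
    return SPEED_OF_SOUND / freq_hz

def is_shadowed(freq_hz):
    # Rough rule of thumb: a shadow forms once the wavelength
    # is smaller than the obstacle.
    return wavelength_m(freq_hz) < HEAD_DIAMETER

for f in (200, 2000, 8000):
    print(f, "Hz:", round(wavelength_m(f), 3), "m,", "shadowed" if is_shadowed(f) else "diffracts around head")
```

A 200 Hz tone has a 1.7 m wavelength and wraps around the head easily; an 8000 Hz tone (about 4 cm) is strongly shadowed, producing a usable interaural level difference.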
Judging elevation
The interaural cues do not help to judge elevation
Instead, the auditory system takes advantage of reflections and interactions from the head, shoulders, and outer ear
Consider the ear

Judging elevation
The folds and crevices of the pinnae produce reflections
Different frequencies of sound reflect differently
It's physics again: it depends on the size of the folds
DTF
There are similar reflections and interference from the shape of the head and the shoulders
Body parts farther away don't usually make much difference
We can characterize the effect as a directional transfer function (DTF)
The impact of reflections on reducing different frequencies of sound

DTF
The DTF depends on the vertical location of the source of sound
Sometimes also called the HRTF (head-related transfer function)
The three curves have different shapes
This means that they affect frequencies differently
So, the frequencies of a sound can provide information about its vertical location
Exactly how it works is too complicated to get into
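One way to see why reflections reduce some frequencies and not others: when a delayed pinna reflection is added to the direct sound, the two cancel at frequencies where they arrive out of phase. This is only a single-reflection comb-filter sketch, not the actual DTF, and the 2 cm extra path length is an assumed illustrative value.

```python
def notch_frequencies(extra_path_m, speed_of_sound=340.0, max_hz=20000):
    """Frequencies canceled when one reflection, traveling
    extra_path_m farther than the direct sound, is added to it.
    Cancellation occurs at odd multiples of 1/(2 * delay)."""
    delay = extra_path_m / speed_of_sound
    notches = []
    k = 0
    while True:
        f = (2 * k + 1) / (2 * delay)
        if f > max_hz:
            return notches
        notches.append(f)
        k += 1

# An assumed 2 cm extra path puts the first spectral notch near 8.5 kHz.
print([round(f) for f in notch_frequencies(0.02)])
```

Because the extra path length changes with the source's elevation, the notch positions shift, which is the kind of frequency-specific signature the three DTF curves illustrate.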
Distance
Similar to distance perception with vision, we depend on a variety of cues to estimate the distance of sound sources:
Interaural level difference
Overall sound level
Frequency
Movement parallax
Reflection

Interaural level difference
The intensity of sound falls off with an inverse square law
This means the ILD between the ears will be large if the sound is close and small if the sound is farther away
Interaural level difference
If a sound is within arm's length, the ILD can be used to give a pretty good estimate of the distance
Beyond that distance, the ILD can only say that the source is farther away than arm's length

Overall sound level
Many sounds have characteristic source intensities
E.g., speech tends to be between 40-70 dB
If you know the normal intensity of a sound, then you can judge the distance by comparing what you hear to what is normal
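The inverse-square-law part of this cue can be sketched directly: for a source off to one side, the far ear is a fixed extra distance away, so the intensity ratio between the ears shrinks as the source recedes. The 17.5 cm ear separation is an assumed value, and the sketch ignores the acoustic head shadow.

```python
import math

EAR_SEPARATION = 0.175  # assumed head width in meters

def distance_ild_db(source_distance_m):
    """ILD (in dB) due to the inverse square law alone, for a source
    directly to one side. Only the distance-dependent part of the cue;
    the head shadow is ignored."""
    near = source_distance_m
    far = source_distance_m + EAR_SEPARATION
    # Intensity falls off as 1/r^2, so the level difference is
    # 10 * log10 of the squared distance ratio.
    return 10 * math.log10((far / near) ** 2)

print(round(distance_ild_db(0.2), 2))   # within arm's length: a large ILD (~5.5 dB)
print(round(distance_ild_db(5.0), 2))   # far away: a tiny ILD (~0.3 dB)
```

Beyond a meter or two the distance-driven ILD is a fraction of a decibel, which is why the cue only works well within arm's length.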
Frequency
As sound travels through air, some frequencies transmit better than others
Air tends to absorb the energy in high-frequency sounds more than low-frequency sounds
Sounds that are far away sound more dull or muffled than near sounds
Kind of like the atmospheric perspective effects with light

Movement parallax
If you move, then the projection of light from nearby objects moves more quickly on the retina than for far objects
Similar behavior if objects are moving
Movement parallax
Likewise, if you can use sound cues to identify the azimuth positions of objects
Then those positions will change differently according to their distance from you

Reflection
Reflections lead to differences in the time when sounds reach your ears
These time differences tend to degrade the sound
This can be a cue to distance
But reflections tend to cause more problems than help
How do you know the source's location is given by the direct sound wave instead of a reflection?
Reflections
The auditory system distinguishes the direct sound wave from reflections by noting when they arrive
Reflections should always take longer to arrive than the direct sound wave

Precedence
For determining the location of a sound source, the auditory system gives precedence to the apparent source of the first sound that arrives
Set up two speakers and play a sound
If the sounds are simultaneous, the location is between the speakers
Precedence
For determining the location of a sound source, the auditory system gives precedence to the apparent source of the first sound that arrives
Set up two speakers and play a sound
If one speaker lags the other just a little bit, the sound location shifts toward the leading speaker

Precedence
For determining the location of a sound source, the auditory system gives precedence to the apparent source of the first sound that arrives
Set up two speakers and play a sound
With a longer lag, the sound appears to come only from the leading speaker
Precedence
For determining the location of a sound source, the auditory system gives precedence to the apparent source of the first sound that arrives
Set up two speakers and play a sound
With a still longer lag, one hears two sounds

Conclusions
Sound localization involves many different cues in the sound stimulus
Azimuth, elevation, distance
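The two-speaker demonstration above can be summarized as a mapping from lag to percept. The threshold values below are rough, assumed figures for click-like sounds (echo thresholds are much longer for speech or music); they are illustrative, not measurements from the lecture.

```python
def precedence_percept(lag_ms):
    """Illustrative sketch of the precedence-effect regimes.
    Lag thresholds are assumed values for click-like sounds."""
    if lag_ms == 0:
        return "one sound, located between the speakers"
    if lag_ms < 1:
        return "one sound, shifted toward the leading speaker"
    if lag_ms < 5:
        return "one sound, only at the leading speaker"
    return "two separate sounds (echo heard)"

for lag in (0, 0.5, 3, 30):
    print(lag, "ms ->", precedence_percept(lag))
```

The progression from fusion, to a shifted location, to dominance by the leading speaker, to a heard echo matches the sequence of demonstrations on the slides.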
Next time
Sound quality
Acoustics
Listening
Auditory grouping