Hearing II Perceptual Aspects

Overview of Topics
Chapter 6 in Chaudhuri: Intensity & Loudness; Frequency & Pitch; Auditory Space Perception.

Intensity & Loudness
Loudness is the subjective perceptual quality of sound related most directly to pressure or intensity. But loudness also depends on the frequency of the sound, among other things.

Absolute Detection Thresholds
The pressure at which loudness is greater than zero (i.e., the absolute threshold) varies with frequency. Maximum sensitivity is at 2-5 kHz, which is also where human speech has most of its energy.

Dynamic Range
[Figure: the auditory dynamic range, bounded below by the absolute threshold and above by the terminal threshold.]

Fundamental Concept: Psychometric vs. Psychophysical Functions
Psychometric function: a function relating a physical quantity to behaviour; it shows one threshold. Psychophysical function: a function relating two physical quantities; it shows how the threshold on one physical quantity varies as a function of the other.
[Figure: four psychometric functions and two psychophysical functions.]

Loudness
Other factors affecting loudness: Binaural vs. monaural presentation: the threshold is 6 dB lower with two ears (spatial integration). Duration of sound: the threshold increases for sounds shorter than about 200 ms (temporal integration).
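As a concrete illustration (a minimal sketch, not part of the lecture; the response proportions below are invented for illustration only), a psychometric function can be estimated by measuring the proportion of "yes, I heard it" responses at several sound levels and fitting a logistic curve; the threshold is then read off as the level at which detection reaches 50%.

```python
# Minimal sketch: fitting a psychometric function (illustrative data only).
# A psychometric function relates a physical quantity (here, sound level in dB)
# to behaviour (proportion of trials on which the tone was detected).
import numpy as np
from scipy.optimize import curve_fit

levels_db = np.array([0, 5, 10, 15, 20, 25, 30])                   # tone level (dB)
p_detect  = np.array([0.02, 0.08, 0.25, 0.55, 0.85, 0.97, 1.00])   # proportion detected

def logistic(x, threshold, slope):
    """Logistic psychometric function; 'threshold' is the level at 50% detection."""
    return 1.0 / (1.0 + np.exp(-slope * (x - threshold)))

(threshold, slope), _ = curve_fit(logistic, levels_db, p_detect, p0=[15, 0.5])
print(f"Estimated detection threshold: {threshold:.1f} dB")
```

Repeating this fit at several tone frequencies and plotting the estimated thresholds against frequency would yield a psychophysical function, such as the minimal audibility curve.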

Fundamental Concept: Spatial & Temporal Integration
Any sensor (biological or artificial) must integrate (add up) signals across a certain range of space and time. Longer/larger integration provides greater sensitivity but lower acuity (because details can be lost). Recall spatial integration in somatosensory systems.

Questions
What is the minimal audibility curve? What is the dynamic range of the auditory system? What is a psychometric function? What is a psychophysical function?

Sound Masking Experiments
A type of psychophysical experiment examining how one sound affects perception of another. Two basic types: Tonal masking: how much does one pure tone impair perception of another? Noise masking: how much does an aperiodic sound impair perception of tones?

Sound Masking Experiments
Two basic methods: constant target frequency with a varying masker frequency (typically used with tonal masking), and varying target frequency with a constant masker frequency (typically used with noise masking). Together, these methods have revealed important things about how the auditory system functions.

Tonal Masking
Typically, a test tone of fixed frequency and amplitude is chosen, say 1000 Hz at 40 dB. Then a series of masking tones of different frequencies are presented. For each mask, the observer adjusts its intensity (dB) until it just drowns out the test tone.

Tonal Masking
Results are shown for a 1000 Hz test tone. The most effective mask is at the same frequency. Note the asymmetry of off-frequency masks.

Noise Masking
A constant noise mask is used. Typically, this will be narrow-band white noise, meaning an aperiodic sound with frequencies in a certain limited range. Example: noise with energy at 410±45 Hz. We say this has a centre frequency of 410 Hz and a bandwidth of 90 Hz.
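As a rough sketch of how the slide's example masker might be synthesized (an assumption about method, not a description of the actual lab stimulus), narrow-band noise centred on 410 Hz with a 90 Hz bandwidth can be made by band-pass filtering white noise between 365 and 455 Hz:

```python
# Minimal sketch: synthesizing a narrow-band noise masker
# (centre frequency 410 Hz, bandwidth 90 Hz, as in the slide's example).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 44100                                    # sample rate (Hz)
duration = 1.0                                # seconds
noise = np.random.randn(int(fs * duration))   # broadband (white) noise

low, high = 410 - 45, 410 + 45                # band edges: 365-455 Hz
b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="bandpass")
narrowband_masker = filtfilt(b, a, noise)     # aperiodic sound limited to 365-455 Hz
```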

Noise Masking
The effect of the noise mask on the absolute threshold for pure tones is measured. This is done for test tones across the frequency spectrum. The threshold is most elevated for test tones near the frequency of the noise. Again, asymmetry is seen off-frequency.

Noise Masking
[Figure: Red: baseline minimal audibility curve (no mask). Green: masked minimal audibility curve (410±45 Hz noise mask).]

Noise Masking
We can subtract the baseline from the masked audibility curve. This gives us the threshold elevation produced by the noise mask. The threshold elevation shows asymmetry: the mask is more effective against higher frequencies.

Masking Asymmetries
Why do masking asymmetries occur? Because of asymmetries in how the basilar membrane responds to pure tones. Specifically, the travelling wave builds up gradually and then collapses suddenly.

Masking Asymmetries
This means the responses to test and masking tones will be more similar if the mask is lower in frequency than the test, and more different if the mask is higher in frequency than the test.

Questions
If a masking tone is 500 Hz in frequency: what frequency of target tone will it most effectively mask? Which target tone will it mask more: a 400 Hz tone, or a 600 Hz tone? What is the reason for the asymmetry in masking?

Frequency & Pitch

Height & Chroma: Do-Re-Mi...Do-Re-Mi
Frequency and pitch relate to one another in a complex way. As frequency increases, one aspect of pitch, called tone height, increases linearly. But another aspect, tone chroma, changes simultaneously in a circular fashion.
[Figure: keyboard showing the notes C4 through B6.]
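A small sketch (illustrative only, assuming the standard twelve-tone equal-tempered scale with C0 at about 16.35 Hz) makes the height/chroma distinction concrete: height keeps increasing as frequency rises, while chroma cycles back to the same note name every octave.

```python
# Minimal sketch: decomposing a frequency into tone height (octave number,
# which rises monotonically) and tone chroma (pitch class, which repeats).
import math

PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def height_and_chroma(freq_hz, ref_c0=16.352):
    """Return (octave number, pitch-class name) for a frequency in Hz."""
    semitones_above_c0 = 12 * math.log2(freq_hz / ref_c0)
    n = round(semitones_above_c0)
    octave = n // 12                  # tone height: grows without repeating
    chroma = PITCH_CLASSES[n % 12]    # tone chroma: repeats every octave
    return octave, chroma

print(height_and_chroma(261.63))   # -> (4, 'C')  middle C
print(height_and_chroma(523.25))   # -> (5, 'C')  same chroma, greater height
```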

Height & Chroma: Do-do-do, Re-re-re
[Figure: keyboard showing the notes C4 through B6.]

Harmony & Disharmony
[Figure: keyboard showing the notes C4 through B6.]

Auditory Localization

Coordinates in Auditory Space
Auditory space surrounds an observer and exists wherever there is sound. Locations in auditory space are defined by: azimuth coordinates (position left to right), elevation coordinates (position up and down), and distance coordinates (position relative to the observer).

Sound Localization Accuracy
On average, people can localize sounds directly in front of them more accurately than those to the side or behind them. In vision, direction cues are present on the retina; in audition, direction cues are not present on the cochlea.

Questions
What are the three coordinates used to describe sound location? In what directions are humans most accurate in judging sound location? Least?

Cues for Sound Localization
Binaural cues: Interaural Time Difference (ITD) and Interaural Intensity Difference (IID). Monaural cue: the Head-Related Transfer Function (HRTF).

Cues for Sound Location
Binaural cues are location cues based on the comparison of the signals received by the left and right ears. The interaural time difference (ITD) is the difference between the times at which a sound reaches the two ears. When the distance to each ear is the same, there is no difference in arrival time; when the source is to the side of the observer, the times differ.

Interaural Time Differences
[Figure.]

Binaural Cues
The interaural intensity difference (IID) is the difference in sound pressure level reaching the two ears. A reduction in intensity occurs at the far ear for high-frequency sounds: the head casts an acoustic shadow. This effect doesn't occur for low-frequency sounds, which diffract around the head.
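To make the ITD cue concrete, a common textbook approximation (Woodworth's spherical-head formula, which the slides do not state explicitly) predicts the time difference from the source azimuth, the head radius, and the speed of sound:

```python
# Minimal sketch: Woodworth's spherical-head approximation of the ITD
# (an assumption; the lecture gives no explicit formula).
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference for a source at a given azimuth."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

print(f"{itd_seconds(0) * 1e6:.0f} us")    # source straight ahead: ITD ~ 0
print(f"{itd_seconds(90) * 1e6:.0f} us")   # source at the side: ITD ~ 650 us
```

The predicted ITD is zero for a source straight ahead and roughly 0.6-0.7 ms for a source directly to one side.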

Acoustic Shadow for High but not Low Frequencies
[Figure.]

Interaural Intensity Difference
The higher the frequency of the sound, the greater the intensity (i.e., level) difference between the two ears when sounds are coming from the side.

A Given ITD/IID...
...cannot, on its own, tell you where a sound is coming from. It can only tell you the sound is coming from somewhere on a particular cone of points; we call these cones of confusion.

Monaural Cue for Sound Location: HRTF
As sounds encounter the head, they are modified by its structures (bones, muscles, etc.), especially the pinna. Some frequencies are reduced in amplitude; others, due to resonance and constructive interference, are increased. The pattern of increases and decreases is called the Head-Related Transfer Function (HRTF). Importantly, the HRTF differs depending on the elevation of the sound source.

Monaural Cue for Sound Location: HRTF
Thus, the head and pinna leave a unique frequency fingerprint on sounds. This fingerprint varies based on the elevation coordinate, helping to resolve which point on the cone of confusion is the source of the sound. However, distance remains to be resolved...

Hofmann Experiment on Judging Elevation
Consider how each frequency element in the bassoon's sound would be affected by the HRTF for different elevations. IID and ITD are not effective for judgments of elevation, since in many locations they may be zero. In an experiment by Hofmann investigating spectral cues, listeners were first measured for performance locating sounds differing in elevation. They were then fitted with a mould that changed the shape of their pinnae.

Results of Hofmann Experiment on Elevation Judgments
Right after the moulds were inserted, performance was poor. After 19 days, performance was close to the original level. Once the moulds were removed, performance stayed high. This suggests that there might be two different sets of neurones, one for each set of HRTF cues.

Results of Hofmann Experiment on Elevation Judgments
[Figure.]

Distance Cues for Sound
Spectrum: high frequencies are more quickly damped by air, so distant sources sound muffled. For a sound with a known spectrum (e.g. speech), distance can be estimated by the degree of muffling.
Loudness: distant sound sources have a lower loudness than close ones. This cue can be evaluated especially for well-known sound sources.
Movement: motion parallax can be used in acoustical perception; nearby sound sources pass by faster than distant sound sources.
Reflection: in enclosed areas, direct sound arrives at the listener's ears without being reflected off a wall, while reflected sound arrives later, after bouncing off a wall. The ratio between direct and reflected sound arrival times can give an indication of the distance of the sound source.

Comparative Sound Localization
Head-tilting is used by some animals to gain sound-source elevation information. Owls have asymmetrical ears, allowing IIDs to be used for elevation (and ITDs for azimuth).
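For the loudness cue, a simple free-field rule of thumb (the inverse-square law, an assumption not stated on the slide) says that level drops by about 6 dB for every doubling of distance:

```python
# Minimal sketch: free-field inverse-square law for level vs. distance
# (a standard approximation, offered here only to make the loudness cue concrete).
import math

def level_drop_db(distance_m, reference_m=1.0):
    """Level change relative to the reference distance, in dB (free field)."""
    return -20 * math.log10(distance_m / reference_m)

print(level_drop_db(2.0))   # about -6 dB at twice the distance
print(level_drop_db(4.0))   # about -12 dB
```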

Questions
What are the main binaural cues for sound location? For which coordinates do they work? Which binaural cue works for high frequencies but not low ones? What is the HRTF? How does it help sound localization?

Methods for Measuring Sound Location Accuracy
Two general experimental methods are used for measuring the accuracy of sound localization: free-field presentation, in which sounds are presented by speakers located around the listener's head in a dark, sound-proof room; and headphone presentation with applied IIDs, ITDs and HRTFs.

Free-Field Presentation
Sounds are presented by speakers located around the listener's head in a dark room. The listener indicates location by pointing or by giving azimuth and elevation coordinates. Advantage: highly naturalistic stimulus properties. Disadvantage: expensive equipment and some lack of control over the exact sound content.

Headphone Presentation of Sounds
Advantage: the experimenter has precise control over the sounds; it is also much cheaper. Disadvantage: cues from the pinna (HRTF) are eliminated, which results in the sound being internalized. Sound can be externalized by measuring each subject's HRTF and applying it to the presented sounds.

Questions
What are the two main methods of testing localization accuracy? What are their advantages and disadvantages? Which binaural cue supersedes the others for low-frequency sounds? What did Hofmann's experiment on pinna shape show?

Physiological Representation of Auditory Space
Interaural time-difference detectors: neurones that respond to specific ITDs. They are found in the auditory cortex and at the first nucleus (the superior olivary nucleus) in the system that receives input from both ears. Topographic map: a neural structure that responds to locations in space (not to be confused with a tonotopic map).

Topographic Maps
Barn owls have neurones in the mesencephalicus lateralis dorsalis (MLD) that respond to locations in space. Mammals have similar maps in subcortical structures, such as the inferior colliculus. These neurones have receptive fields for sound location.
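In practice, "applying" an HRTF for headphone presentation usually means convolving a dry mono signal with a measured pair of head-related impulse responses (HRIRs), the time-domain counterpart of the HRTF. The sketch below assumes such a pair is available; the impulse responses and the mono signal shown are placeholders, not real measurements.

```python
# Minimal sketch: rendering a mono sound for headphones by convolving it with
# a left/right pair of head-related impulse responses (placeholders below).
import numpy as np
from scipy.signal import fftconvolve

fs = 44100
mono = np.random.randn(fs)                          # placeholder 1-second mono signal
hrir_left = np.zeros(256);  hrir_left[0] = 1.0      # placeholder HRIRs; real ones
hrir_right = np.zeros(256); hrir_right[30] = 0.7    # come from per-subject measurement

left_channel  = fftconvolve(mono, hrir_left)
right_channel = fftconvolve(mono, hrir_right)
binaural = np.stack([left_channel, right_channel], axis=1)  # play over headphones
```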

[Figure: the receptive fields for location (rectangles) of three topographic neurones in the owl's MLD.]

The Auditory Cortex
Even though there are topographic maps in subcortical areas of mammals, there is no evidence of such maps in the cortex. Instead, panoramic neurones have been found that signal location by their pattern of firing. In general, the "where" stream shows more specific neural responses for location the further upstream one goes in the cortex.

Panoramic Neurones
Panoramic neurones fire different patterns of bursts depending on the direction from which a sound is coming.

Questions
What does it mean to say that some animals have a topographic map of sound locations in their cortices? What is a panoramic neurone?

Auditory Scene Analysis
Auditory scene: the array of all sound sources in a listener's environment. Auditory scene analysis: the process by which sound sources in the auditory scene are separated into individual perceptions.

Identifying Sound Sources
This does not happen at the cochlea: simultaneous sounds are combined together in the pattern of vibration of the basilar membrane.

Aspects of Auditory Scene Analysis
Segregation of the sound signal into parts coming from individual sources. Grouping of separate sounds into those coming from a given source. Sound localization.

Auditory Scene Analysis
Sounds from different sources are mixed together on the cochlea. Therefore, auditory scene analysis must take place entirely at later stages of audition.

Fundamental Concept: Gestalt Heuristics
Recall that a heuristic is a rule of thumb used by a system (e.g., the brain). A gestalt heuristic is such a rule applied to organizing sensory inputs, to determine, for example, which elements of a scene belong to which objects, and which elements represent edges between an object and its background.

Fundamental Concept: Gestalt Heuristics
Many gestalt heuristics have been proposed. Examples include the proximity principle and the similarity principle. Obviously, these are only partially defined.

Principles of Auditory Grouping
Like visual stimuli, sound stimuli tend to be perceptually organized according to the following heuristics. Proximity: sounds from a single source tend to come from one location. Similarity: sounds from a single source tend to be similar (in pitch, loudness, timbre, etc.). Smoothness: sounds from a single source tend to change (in location, pitch, loudness, etc.) in a smooth and continuous way. Etc.

Auditory Streaming
A demonstration of auditory scene analysis. When sounds A and B are similar in pitch, they are grouped together into a single galloping sound stream based on proximity in time and timbre. When pitches A and B are different, they are grouped into two streams, one for the low-pitched A sounds and another for the high-pitched B sounds.
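The streaming demonstration can be sketched in code (an illustrative reconstruction, not the actual classroom demo): a repeating A-B-A triplet followed by a brief silence, where the pitch separation between A and B determines whether one galloping stream or two separate streams is heard.

```python
# Minimal sketch: synthesizing the classic "galloping" A-B-A streaming sequence.
# Small pitch separation -> one galloping stream; large separation -> two streams.
import numpy as np

fs = 44100

def tone(freq, dur=0.1):
    """A short sine-tone burst."""
    t = np.arange(int(fs * dur)) / fs
    return np.sin(2 * np.pi * freq * t)

def aba_sequence(freq_a, freq_b, repeats=10):
    """Repeated A-B-A triplets separated by a brief silence."""
    silence = np.zeros(int(fs * 0.1))
    triplet = np.concatenate([tone(freq_a), tone(freq_b), tone(freq_a), silence])
    return np.tile(triplet, repeats)

one_stream  = aba_sequence(500, 550)    # similar pitches: heard as one galloping stream
two_streams = aba_sequence(500, 1000)   # dissimilar pitches: heard as two streams
```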

I. Alternating high and low tones are perceived to be part of a single stream of sound due to relatively similar pitch and close spacing in time. II. Alternating high and low tones are perceived to be part of two separate streams of sound due to relatively dissimilar pitch (changing the speed would have similar effects).

Experiment by Deutsch (Melodic Channeling)
Stimuli were two sequences alternating between the right and left ears. Listeners perceive two smooth sequences by grouping the sounds by similarity in pitch. The proximity and smoothness heuristics win out over reality in this case!

Deutsch's Melodic Channeling Experiment
http://tinyurl.com/6pc66gr
[Figure: (a) Stimuli presented to the listener's left ear (blue) and right ear (red); note that the notes presented to each ear jump up and down. (b) What the listener hears: although the notes in each ear jump up and down, the listener perceives a smooth sequence of notes.]

Questions
Name some Gestalt principles that apply to auditory scene analysis. Name some characteristics of sound by which sounds from the environment are grouped.

Wessel's Timbre Illusion
http://tinyurl.com/7vfy2eq
Demonstrates how heuristics can compete with each other. The smoothness heuristic tries to group by smoothly changing pitch and dominates at low speeds. The similarity heuristic tries to group by timbre and dominates at high speeds.

Auditory Grouping
Proximity in time: sounds that occur in rapid succession usually come from the same source. This principle was illustrated in auditory streaming. Good continuation: sounds that stay constant or change smoothly are usually from the same source.

Good Continuation Heuristic
Demonstration by Bregman et al.: tones were presented interrupted by gaps of silence or by noise. In the silence condition, listeners perceived that the sound stopped during the gaps. In the noise condition, the perception was that the sound continued behind the noise.

Good Continuation Heuristic
Example I: the sound rises and falls, with gaps; the gaps are perceived. Example II: the sound rises and falls, but the gaps are filled with noise; the sound seems continuous.

Auditory Grouping: Effects of Experience
Heuristics can be affected by learning and knowledge. Experiment by Dowling: two interleaved melodies were used ("Three Blind Mice" and "Mary Had a Little Lamb"). Listeners reported hearing a meaningless jumble of notes, but listeners who were told to listen for the melodies were able to hear them by using a melody schema.

Schema-driven Grouping
Three Blind Mice; Mary Had a Little Lamb; mash-up of the above: http://tinyurl.com/7sbgugm

Interactions Between Vision and Sound
Visual capture, or the ventriloquist effect: an observer perceives a sound as coming from the seen location rather than from its actual source. Experiment by Sekuler et al.: balls moving without sound appeared to move past each other; balls with an added click appeared to collide.

Questions
Where are you most likely to experience indirect sound? What does Sekuler's experiment with the bouncing/non-bouncing balls demonstrate?