Language: Speech

Speech is the preferred modality for language.


Outer ear: collects sound waves. The configuration of the outer ear serves to amplify sound, particularly at 2000-5000 Hz, a frequency range that is important for speech. Middle ear: transforms the energy of a sound wave into the internal vibrations of the bone structure of the middle ear, and transforms these vibrations into a compressional wave in the inner ear. Inner ear: transforms the energy of a compressional wave within the inner-ear fluid into nerve impulses that can be transmitted to the brain.

Auditory input reaches primary auditory cortex about 10-15 msec after stimulus onset (Liegeois-Chauvel, Musolino, & Chauvel, 1991; Celesia, 1976). http://www.sis.pitt.edu/~is1042/html/audtrans.gif

Primary auditory cortex (A1): Brodmann areas 41 and 42. Auditory-association cortex (A2): Brodmann area 22. http://ist-socrates.berkeley.edu/~jmp/lo2/6.html

Questions:
Is speech special? Is there brain activity that is specifically sensitive to the properties of speech (as opposed to those of other sounds)?
Can brain damage lead to a selective loss of speech perception?
Are the mechanisms of speech perception different from the mechanisms of auditory perception in general?
Role of audio-visual integration.

An auditory evoked neuromagnetic field: the M100 field pattern (Roberts, Ferrari, Stufflebeam & Poeppel, J Clin Neurophysiol, Vol 17, No 2, 2000)

M100 (N1 in ERP terms):
Intensity: higher intensity → shorter latency, larger amplitude.
Frequency: higher frequency → shorter latency.
Phonetic identity: e.g., shorter latencies for /a/ than for /u/ (explainable in spectral terms).
The function of M100 activity in speech processing is unclear.

M100 (N1 in ERP terms) (Roberts & Poeppel, Neuroreport, 1996)

M100 (N1 in ERP terms) (Poeppel et al., Neuroscience Letters, 1996)

Questions:
Is speech special? Is there brain activity that is specifically sensitive to the properties of speech (as opposed to those of other sounds)? Unclear.
Can brain damage lead to a selective loss of speech perception?
Are the mechanisms of speech perception different from the mechanisms of auditory perception in general?
Role of audio-visual integration.


Disorders of auditory processing: rare. The signal from each ear is processed in both hemispheres (unlike in the visual system), so bilateral damage is often necessary, which generally requires two separate neurological events.

Disorders of auditory processing: CORTICAL DEAFNESS. Inability to hear sounds despite no apparent damage to the hearing apparatus and no brainstem abnormality. Caused by extensive bilateral damage to auditory cortex (BAs 41 & 42).

Disorders of auditory processing: AUDITORY AGNOSIA. Inability to recognise auditorily presented sounds (e.g., coughing, crying), independent of any deficit in processing spoken language. Damage in auditory association cortex (BAs 22 & 37).

Disorders of auditory processing: AMUSIA (a subvariety of AUDITORY AGNOSIA). Impairment in tasks requiring pattern recognition in music, with relative sparing of speech and (other) non-speech perception. Damage is generally in right temporal areas.

Disorders of auditory processing: PURE WORD DEAFNESS. Inability to understand spoken words while auditory perception is otherwise intact and other linguistic skills (reading, speech production) are preserved.

Disorders of auditory processing: PURE WORD DEAFNESS. Caused either by bilateral damage to auditory cortex, or by a subcortical lesion in the left hemisphere that severs both ipsilateral and contralateral projections to Wernicke's area (Ziegler, 1952; Geschwind, 1965; Coslett et al., 1984; Jones and Dinolt, 1952).

Disorders of auditory processing: PHONAGNOSIA Impairment in the ability to recognise familiar voices. Speech comprehension is intact. Intact ability to identify nonverbal sounds.

Disorders of auditory processing: PHONAGNOSIA Van Lancker et al., 1988: Double dissociation between memory for familiar voices and the ability to discriminate between unfamiliar voices. Performance on a discrimination task was impaired by a lesion to either temporal lobe. Performance on the famous voices task was impaired by lesions to the right parietal lobe.

Questions:
Is speech special? Is there brain activity that is specifically sensitive to the properties of speech (as opposed to those of other sounds)? Unclear.
Can brain damage lead to a selective loss of speech perception? YES, but there is no one-to-one correspondence between lesion site and type of disorder; e.g., bilateral temporal damage can lead either to cortical deafness or to pure word deafness.
Are the mechanisms of speech perception different from the mechanisms of auditory perception in general?
Role of audio-visual integration.

Bilaterality of speech processing. Poeppel (2001), Poeppel & Hickok (2000): speech processing is fundamentally bilateral, but the LH and RH have different temporal integration windows. The LH processes fast phenomena (25-50 ms): acoustic changes in frequency that allow phoneme perception. The RH processes slow aspects of speech (150-200 ms): syllable-level phenomena, intonation, prosody.
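The two-window idea can be illustrated with a toy sketch (illustrative code, not from Poeppel's work): compute RMS envelopes of the same signal with a short (~25-50 ms) and a long (~150-200 ms) averaging window. The short window tracks fast, segment-rate fluctuations; the long window keeps only slow, syllable-rate structure. All signal parameters below are invented for the demonstration.

```python
import numpy as np

def rms_envelope(signal, sr, window_ms):
    """Sliding-window RMS envelope; window length given in milliseconds."""
    win = max(1, int(sr * window_ms / 1000))
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(signal ** 2, kernel, mode="same"))

sr = 16000                      # sample rate in Hz
t = np.arange(0, 0.5, 1 / sr)  # half a second of signal
# Toy "speech-like" signal: a 150 Hz carrier whose amplitude is modulated
# both quickly (40 Hz, roughly segment-rate) and slowly (4 Hz, syllable-rate).
sig = ((1 + 0.5 * np.sin(2 * np.pi * 4 * t))
       * (1 + 0.5 * np.sin(2 * np.pi * 40 * t))
       * np.sin(2 * np.pi * 150 * t))

fast = rms_envelope(sig, sr, 30)   # short window, "LH-like" resolution
slow = rms_envelope(sig, sr, 175)  # long window, "RH-like" resolution
# The long-window envelope varies far more smoothly than the short-window one.
```

The two envelopes have the same length as the input; only the timescale of the detail they preserve differs.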


What makes speech sound like speech?

Sine-wave speech. A spectrogram is an acoustic "picture" of a speech utterance: time is represented on the horizontal axis, frequency on the vertical axis, and amplitude corresponds to the darkness. A sine-wave replica of the natural utterance discards all fine-grain acoustic properties of speech and retains only the coarse-grain changes in the spectra over time.
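The construction can be sketched in a few lines: replace each formant track with a single sinusoid whose frequency follows that track, then sum the sinusoids. This is a minimal illustration, not the actual Remez et al. procedure; the formant tracks below are invented, whereas real sine-wave speech uses tracks estimated from a natural utterance.

```python
import numpy as np

def sine_wave_replica(formant_tracks, sr):
    """Sum one pure sinusoid per formant track.

    Each track is an array of instantaneous frequencies (Hz), one value per
    sample; a sinusoid's phase is the running integral of its frequency.
    """
    out = np.zeros(len(formant_tracks[0]))
    for freqs in formant_tracks:
        phase = 2 * np.pi * np.cumsum(freqs) / sr
        out += np.sin(phase)
    return out / len(formant_tracks)  # keep the sum within [-1, 1]

sr = 16000
n = sr // 2  # half a second
# Invented tracks, roughly like a vowel transition:
f1 = np.linspace(300, 700, n)    # F1 glides upward
f2 = np.linspace(2200, 1100, n)  # F2 glides downward
f3 = np.full(n, 2600.0)          # F3 stays flat
sws = sine_wave_replica([f1, f2, f3], sr)
```

Only three slowly varying frequencies survive; everything that makes the signal sound voice-like (harmonics, noise, fine temporal structure) is gone, which is why naive listeners tend not to hear it as speech.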

Sine-wave speech. Remez et al. (1981): naive subjects fail to perceive sine-wave stimuli as speech. However, if subjects are instructed about the speech-like nature of sine-wave speech, they can easily understand it. An argument for a special speech mode of processing.

Sine-wave speech: demo of six sentences, each presented as a sine-wave-speech (SWS) version and as the original recording.

Role of articulatory gestures in speech perception. Recognition of sine-wave speech, or of speech that has been degraded by noise, is improved if the stimulus is accompanied by congruent articulatory gestures.

McGurk effect

Role of articulatory gestures in speech perception. Incongruent articulatory gestures can change the auditory percept even when the signal is clear.

Is audio-visual integration specific to speech? Saldaña and Rosenblum (1993): no. They found audio-visual integration of the plucks and bows of cello playing. Not only speech but also other ecologically valid combinations of auditory and visual stimuli can integrate in a complex manner.

Question: Is speech perceived like all other sounds, or is there a specialized mechanism responsible for coding the acoustic signal into phonetic segments?

Method, experiment 1:
1. Training in non-speech mode: subjects were first trained to distinguish between Stimulus 1 and Stimulus 2, the sine-wave replicas of the Finnish nonwords /omso/ and /onso/.
2. Non-speech test: SWS tokens were presented alone or audiovisually with a congruent or incongruent visual articulation. Subjects indicated by a button press whether they heard Stimulus 1 or 2.
3. Natural speech test: as in 2, but the auditory stimuli were natural tokens of /onso/ and /omso/. Subjects indicated whether the consonant they heard was /n/, /m/, or something else.
4. Training in speech mode: subjects were taught to categorize the SWS stimuli as /omso/ and /onso/.
5. Test in speech mode: subjects indicated whether the consonant they heard was /n/, /m/, or something else.

Results

Experiment 2: Speech mode training and test occur first

Questions:
Is speech special? Is there brain activity that is specifically sensitive to the properties of speech (as opposed to those of other sounds)? Unclear.
Can brain damage lead to a selective loss of speech perception? YES, but there is no one-to-one correspondence between lesion site and type of disorder.
Are the mechanisms of speech perception different from the mechanisms of auditory perception in general?
Role of audio-visual integration: may be special for speech.