THE ROLE OF VISUAL SPEECH CUES IN THE AUDITORY PERCEPTION OF SYNTHETIC STIMULI BY CHILDREN USING A COCHLEAR IMPLANT AND CHILDREN WITH NORMAL HEARING
Vanessa Surowiecki 1, David Grayden 1, Richard Dowell 2, Graeme Clark 1,2 & Paul Maruff 3
1 The Bionic Ear Institute, East Melbourne
2 Dept. of Otolaryngology, The University of Melbourne
3 Dept. of Psychology, La Trobe University

ABSTRACT: Debate continues as to the benefits of visual speech cues for the auditory perception of children using a cochlear implant. Ten children using a cochlear implant were matched by age, gender and Performance IQ to children with normal hearing. Participants responded as to whether they perceived /ba/, /da/ or /ga/ when presented with nine synthetic voiced plosives, presented either in isolation or in synchrony with visual articulations of /ba/ and /ga/. Comparison between the two test conditions revealed that the children's auditory perceptions were influenced by visual speech cues. Visual /ga/ paired with the auditory stimuli resulted in fewer auditory /ba/ percepts and more auditory /da/ percepts for both groups. Congruent visual /ga/ cues did not improve auditory perception of the /ga/ stimuli. Visual /ba/ presented in synchrony with the nine stimuli resulted in more /ba/ percepts for stimuli 3-9 and fewer /da/ and /ga/ percepts. Auditory perception of /ba/ was improved under the congruent visual /ba/ condition. The implanted group's apparent response bias towards visual /ba/ articulations for /da/ and /ga/ auditory stimuli could be attributed to the evident uncertainty with which they labelled these stimuli in the auditory-alone condition. These results suggest that all children, in particular those using a cochlear implant, are likely to benefit from seeing a talker's mouth when listening in situations where the auditory signal is ambiguous.
INTRODUCTION

Following cochlear implantation, hearing-impaired children are provided with a range of habilitation services aimed at developing their auditory perceptual skills. Habilitation specialists disagree as to whether visual speech cues aid or hinder the perception of auditory speech sounds. This paper examines the influence of visual speech cues on implanted children's auditory perception of synthetic voiced plosives, comparing their performance with that of children with normal hearing. Prior to implantation, children with a profound hearing loss rely on visual speech cues to supplement the limited auditory stimuli they receive via hearing aids. This has led some researchers to suggest that individuals with hearing loss may develop a response bias towards visual speech cues (Walden, Montgomery, et al., 1990). Advocates of Auditory-Verbal therapy propose that a bias towards visual speech cues will detract from the child's ability to utilise the new speech sounds presented via the implant (Vaughan, 1981). Auditory-Verbal therapists recommend that parents do not face their child when speaking, minimising the child's use of visual speech cues (Beebe, Pearson, & Koch, 1984). However, regardless of hearing status, all individuals use visual speech cues to complement auditory processing (Green & Kuhl, 1989; McGurk & MacDonald, 1976). In particular, visual articulations are known to improve auditory perception of speech when the auditory signal is degraded, when there is a poor signal-to-noise ratio, or when the individual is listening monaurally (Arnsten, 1998; Erber, 1969; Grant, Ardell, et al., 1985; Pichora-Fuller, 1996; Sumby & Pollack, 1954). Despite the limited visual cues for many articulations, individuals perceive sounds more consistently when presented with congruent visual speech cues. Parents must choose which communication style they believe will assist their child to acquire auditory perceptual skills.
This major decision impacts on the skills the child will possess throughout their education and will influence their prospects as an adult. To determine the influence of visual speech cues on the auditory perceptions of children using a cochlear implant, visual articulations were paired with a continuum of synthetic voiced plosives. Using a combination of a categorical perception task and the McGurk test (McGurk & MacDonald, 1976), the study paired nine auditory stimuli with visual articulations of /ba/ and /ga/ (Walden, Montgomery, et al., 1990). Transition feature cues were provided in the /ba-da-ga/ continuum because the temporal aspects of these cues pair well with temporal changes in visual articulations (Green & Norrix, 1997). Previous reports indicate that the influence of visual articulations on auditory perceptions may relate to listening experience (Hockley & Polka, 1994; Massaro, 1984; McGurk & MacDonald, 1976).
Melbourne, December 2 to 5, 2. Australian Speech Science & Technology Association Inc. Accepted after abstract review page 433
Children with
normal hearing were included as a comparison group to determine whether the implanted children's responses were related to chronological age. Specifically, the study anticipated the following: the auditory percepts of both groups of children would change when the stimuli were presented in synchrony with visual articulations, and auditory perception of speech sounds would improve when the stimuli were presented in synchrony with congruent visual articulations.

METHOD

Subjects

Ten children using a Nucleus multiple-channel cochlear implant were matched by age, gender and Performance IQ (Wechsler, 1991) to children with normal hearing. Implanted children were identified through the Royal Victorian Eye and Ear Hospital's Cochlear Implant Clinic patient list, whereas control participants were identified through state and Catholic primary schools. Tested with a screening audiogram, the control group possessed pure tone averages at dB HL or better at the frequencies of 5Hz, 1kHz and 2kHz. The sample mean age was 8 years, 6 months (SD = 1 year, 4 months) and the mean Performance IQ for the group was 11.8 (SD = 8.95). The mean age at cochlear implantation was 4 years, 2 months (SD = 2 years, 4 months) and the children had on average 4 years, 4 months (SD = 2 years, 5 months) of experience listening with the device.

Stimulus Materials

The synthetic stimuli were created using values obtained from natural recordings of a female Australian speaker. Nine stimuli were created varying in spectral transition cues along a /ba-da-ga/ continuum. The fundamental frequency began at 4Hz and decreased linearly to 156Hz across the 65ms. The first formant onset was at 3Hz for the consonant portion of the stimuli and voice onset was set at ms. The second formant began at 1.2kHz for stimulus 1, increased in 25Hz steps to stimulus 5, then increased in Hz steps to stimulus 9. The third formant value for stimulus 1 was 2.7kHz, increasing in 75Hz steps to 3kHz for stimulus 5 then decreasing in 75Hz steps to 2.7kHz for stimulus 9.
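As an illustrative aside (not part of the original paper), the F2 and F3 onset pattern described above can be tabulated in a few lines of Python. The F2 step size between stimuli 5 and 9 is elided in the text, so it is deliberately left undefined here rather than guessed:

```python
# Sketch of the formant-onset continuum described above (stimuli 1-9).
# The F2 step between stimuli 5 and 9 is elided in the source text, so
# f2_onset() is only defined for stimuli 1-5; nothing here is guessed.

F2_START = 1200.0        # Hz, F2 onset for stimulus 1 (stated above)
F2_STEP_1_TO_5 = 25.0    # Hz per stimulus, stimuli 1-5 (stated above)
F3_STEP = 75.0           # Hz per stimulus (stated above)

def f3_onset(i: int) -> float:
    """F3 onset (Hz): 2.7 kHz at stimulus 1, peaking at 3 kHz for
    stimulus 5, returning to 2.7 kHz at stimulus 9."""
    assert 1 <= i <= 9
    return 2700.0 + F3_STEP * (i - 1) if i <= 5 else 3000.0 - F3_STEP * (i - 5)

def f2_onset(i: int) -> float:
    """F2 onset (Hz); defined only for stimuli 1-5 given the elided step size."""
    if 1 <= i <= 5:
        return F2_START + F2_STEP_1_TO_5 * (i - 1)
    raise ValueError("F2 step size for stimuli 6-9 is not given in the text")

for i in range(1, 10):
    f2 = f2_onset(i) if i <= 5 else float("nan")
    print(f"stimulus {i}: F2 onset {f2} Hz, F3 onset {f3_onset(i)} Hz")
```

Note that the stated F3 trajectory is internally consistent: four 75 Hz steps raise F3 by 300 Hz from 2.7 kHz to 3 kHz at stimulus 5, and four further steps return it to 2.7 kHz at stimulus 9.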
Stimuli 2, 5 and 8 were the exemplar tokens for /ba/, /da/ and /ga/ respectively. Values for /a/ were 1kHz for F1, 1.5kHz for F2 and 2.7kHz for F3. The fourth formant was kept at 4kHz for the entire syllable duration. The stimulus set was RMS normalised. Digital video recordings were made of a female articulating the sounds /ba/, /da/ and /ga/. The female was shown from the neck to under her eyes. Audio-visual files were created by pairing each of the auditory stimuli with the visual articulations of /ba/ and /ga/.

Procedure

The children were assessed individually in a quiet room either within their home or at school. A Toshiba 47CDT notebook was used to run the activities, with the response options and video files presented on a 15 ELO Intellitouch touch-sensitive monitor. The child sat facing the monitor throughout testing. A self-amplified Wharfedale speaker was placed on a tripod one metre in front of the hearing-impaired child's implanted ear, and on either side for children with normal hearing. The speaker had a broad frequency response from Hz to 15kHz. The speaker height was adjusted to each child's ear level, and the sound level was calibrated to 68 dB SPL at the child's ear level using a sound level meter (Quest Electronics). During the initial one-hour test session of a larger study, the child completed four activities. For each activity, the response options ba, da and ga were presented as buttons on the monitor. For all four tasks the child responded by selecting one of the buttons to indicate the sound they perceived. An auditory-alone practice trial was included to allow the child to become familiar with the exemplar stimuli. Following this trial was an auditory-alone test consisting of each of the nine stimuli presented ten times in random order. The children were then given a visual-alone trial to give them practice in recognising the articulations.
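As an illustrative aside (not from the paper), the RMS normalisation mentioned above amounts to scaling each waveform so that all tokens share the same root-mean-square level. A minimal NumPy sketch follows; the target_rms value is an arbitrary assumption, not a value from the study:

```python
import numpy as np

def rms_normalise(signals, target_rms=0.1):
    """Scale each waveform so that every stimulus has the same RMS level.

    `signals` is a list of 1-D float arrays; `target_rms` is an arbitrary
    common level (an assumption here, not a value from the paper)."""
    normalised = []
    for x in signals:
        rms = np.sqrt(np.mean(np.square(x)))
        normalised.append(x * (target_rms / rms))
    return normalised

# Example: two sine tokens with very different levels end up at equal RMS.
t = np.linspace(0.0, 0.065, 1430)          # ~65 ms at a 22 kHz sample rate
quiet = 0.05 * np.sin(2 * np.pi * 240 * t)
loud = 0.9 * np.sin(2 * np.pi * 240 * t)
for y in rms_normalise([quiet, loud]):
    print(np.sqrt(np.mean(y ** 2)))        # both print the same RMS level
```

Equalising RMS in this way ensures that any perceptual differences between stimuli reflect their spectral content rather than overall loudness.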
The final activity was the audio-visual test, in which the children observed the video recordings of /ba/ and /ga/ presented in synchrony with each of the nine stimuli, each combination presented ten times. This activity was divided into two blocks of 9 audio-visual stimuli with a rest period allowed in between. Results from the auditory-alone and the audio-visual tests were recorded.
RESULTS

Independent samples t-tests revealed no significant differences between the two groups for age, t(18) = .15, p = .989, or Performance IQ, t(18) = .243, p = .81. All children completed the auditory-alone (AA) and audio-visual (AV) testing. Results of the AA testing are displayed in Figures 1 and 2. The children's responses to the AV visual /ga/ articulations are shown in Figures 3 and 4, and responses to visual /ba/ are shown in Figures 5 and 6. Results from the AA task indicated that the control group perceived stimuli 1 and 2 as /ba/, stimuli 4-6 as /da/, and stimuli 8 and 9 as /ga/. Children using a cochlear implant were able to consistently perceive /ba/ for stimuli 1 and 2 under the AA condition. The implanted group's perception of /da/ and /ga/ was less consistent, with all responses lower than 5% for the nine stimuli.

Figure 1. AA condition: Average responses made by control group.
Figure 2. AA condition: Average responses made by implant group.
Figure 3. AV condition with visual /ga/: Average responses made by control group. Significant Paired Samples results in red.
Figure 4. AV condition with visual /ga/: Average responses made by implant group. Significant Paired Samples results in red.
Figure 5. AV condition with visual /ba/: Average responses made by control group. Significant Paired Samples results in red. Significant Wilcoxon Signed Ranks results in green.
Figure 6. AV condition with visual /ba/: Average responses made by implant group. Significant Paired Samples results in red. Significant Wilcoxon Signed Ranks results in green.
Surowiecki et al. Visual biasing of auditory percepts by children

To compare mean group performance for the AA and AV conditions, paired samples t-tests were used for parametric variables and Wilcoxon Signed Ranks tests were conducted on non-parametric variables. Audio-visual responses that significantly differed from the AA condition at a level of .05 or lower are marked in Figures 3-6. As seen in Figure 3, when provided with visual /ga/ articulations, the mean number of /ba/ responses made by the control group for stimuli 1-3 decreased, whilst their average number of /da/ responses increased for stimuli 2 and 3 and decreased for stimulus 4. Under this visual /ga/ condition, the control group's /ga/ responses to stimulus 9 decreased. When presented with visual /ga/ articulations, children using a cochlear implant perceived fewer /ba/ stimuli when listening to stimuli 1-4 and 8, and more /da/ syllables when presented with stimulus 1. The implanted group's results are displayed in Figure 4. As presented in Figure 5, the control group's auditory perception of /ba/, /da/ and /ga/ was altered by the simultaneous presentation of visual /ba/ articulations. When presented with visual /ba/, children with normal hearing perceived /ba/ more often for stimuli 3-9, resulting in significantly fewer /da/ responses for these same stimuli. The mean number of /ga/ responses also decreased for stimuli 8 and 9. Under the same visual /ba/ condition, the implanted children's auditory perceptions of stimuli 3-9 were also altered. Children using a cochlear implant perceived /ba/ for all nine stimuli in this AV condition, with a minimum /ba/ response rate of 47%. The mean number of /da/ responses decreased for stimuli 3-9, as did /ga/ responses for stimuli 3, 4, 6 and 7.

DISCUSSION

As anticipated, all participants were influenced by visual speech cues, as evidenced by their different responses to the nine stimuli in the audio-visual (AV) conditions.
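As an illustrative aside (not the authors' analysis code), the paired-samples comparison described above tests whether each child's score changes between the AA and AV conditions. A minimal pure-Python sketch of the paired t statistic, with hypothetical per-child response counts:

```python
import math

def paired_t(aa_scores, av_scores):
    """Paired-samples t statistic and degrees of freedom (n - 1) for two
    conditions measured on the same participants."""
    diffs = [a - b for a, b in zip(aa_scores, av_scores)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Hypothetical per-child /ba/ response counts (out of 10) in AA vs AV;
# these numbers are invented for illustration, not data from the study.
aa = [9, 8, 10, 7, 9, 8, 10, 9, 8, 9]
av = [6, 7, 8, 5, 7, 6, 9, 7, 6, 7]
t, df = paired_t(aa, av)
print(f"t({df}) = {t:.3f}")  # compare against the critical value for alpha = .05
```

The Wilcoxon Signed Ranks test mentioned above plays the same role for non-parametric variables, ranking the magnitudes of the per-child differences instead of assuming they are normally distributed.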
Compared to the auditory-alone (AA) condition, children in the control group reported fewer /ba/ percepts when stimuli 1-3 were paired with visual /ga/ articulations. Despite the fact that these children consistently labelled these stimuli as /ba/ in the AA condition, and despite the limited visual cues available in a /ga/ articulation, the visual /ga/ presentation altered the children's perceptions of the stimuli. The number of /da/ percepts increased for the control group under the visual /ga/ condition, suggesting the articulations assisted the children to perceive /da/ qualities in the transition-only stimuli. Congruent visual /ga/ cues did not increase the control group's auditory perception of stimuli 4-9. Performance actually decreased under the visual /ga/ condition, with fewer /da/ percepts for stimulus 4 and fewer /ga/ responses to stimulus 9. However, the overall shape of the control group's response pattern remained similar to that in the AA condition. This suggests that while the children were influenced by the visual articulations of /ga/, they were able to utilise the available transition cues to label the stimuli as /ba/, /da/ or /ga/. Children using a cochlear implant were also influenced by visual /ga/ articulations. In particular, the children responded less often that they perceived /ba/ for stimuli 1-4 under the visual /ga/ condition. The group's mean number of /da/ responses increased for stimulus 1; however, the percentage of /da/ responses in this condition remained below chance level. No difference was identified between the number of /ga/ responses made by the implanted group under the AA and AV visual /ga/ conditions. The implanted group's response rate to the /da/ and /ga/ syllables remained low across the nine stimuli, with a maximum response rate of 53%. Combined, these results suggest that the presentation of congruent /ga/ articulations did not assist either group in improving their auditory perception of the stimuli.
Under the visual /ba/ condition, the control group perceived stimulus 3 as /ba/, whereas under the AA condition they were uncertain of the sound. This suggests that when presented with an ambiguous speech sound, such as stimulus 3, children with normal hearing use congruent visual speech cues to inform their judgement of the sound. Under the visual /ba/ condition, children with normal hearing perceived fewer /da/ syllables for stimuli 3-5 and fewer /ga/ syllables for stimuli 8 and 9. Previous reports indicate that auditory perception of /da/ and /ga/ is greatly assisted by the presence of burst cues (Green & Norrix, 1997). As the present set of stimuli contained only transition cues, it is possible that the control group perceived ambiguity in the auditory signal for /da/, resulting in greater reliance on visual speech cues to supplement their auditory perception of these sounds (Massaro, 1984). However, whilst the control group gave fewer /da/ and /ga/ responses under this AV condition, the shape of their /da/ and /ga/ response patterns remained comparable with their auditory-alone responses. In contrast, when presented with visual /ba/, children using a cochlear implant consistently perceived the syllable /ba/ for all nine stimuli. Under the visual /ba/ condition, the implant group perceived more /ba/ qualities in stimuli 3 and 4, suggesting that the congruent visual speech cues assisted the children in their perception of these blended sounds. Their remaining responses to stimuli 5-9 suggest that
the children were strongly influenced by the visual /ba/ articulations, as all /da/ and /ga/ responses reduced to below chance level. This could be attributed to the hearing-impaired group's lower rates of /da/ and /ga/ responses in the auditory-alone condition, indicating difficulty perceiving the sounds as either syllable. Ambiguity in the auditory signal may result in increased reliance on the strong visual /ba/ articulations to create a percept of the stimuli (Massaro, 1984). Further investigation is warranted to determine the reason behind the children's responses. Ambiguity in the signal could be attributed to the synthetic tokens used and the limited speech features made available to the children. A pilot study with post-lingually hearing-impaired adult cochlear implant users suggested that /da/ was perceived at a response rate of 52% for stimulus 5 and /ga/ was perceived at 67% for stimuli 8 and 9. Combined, these results suggest that the speech processor may not encode and present the synthetic stimuli to the auditory nerve with sufficient features to create /da/ and /ga/ percepts. Comparison between the current sample of implanted children and language-experience-matched children could determine the extent to which listening experience contributed to the findings. Overall, these findings suggest that strong visual articulations such as /ba/ assist children to perceive speech sounds, in particular when those sounds are ambiguous. In the current study, ambiguity was introduced by creating synthetic stimuli containing only transition cues. In everyday listening situations, ambiguity is added to the auditory signal when listening to a degraded signal, when listening to speech presented in noise, or when listening monaurally. In such situations, all children, in particular those using a cochlear implant, are likely to benefit from seeing the talker's articulations.

REFERENCES

Arnsten, A. F. T. (1998).
Catecholamine modulation of prefrontal cortical cognitive function. Trends in Cognitive Sciences, 2(11).
Beebe, H. H., Pearson, H. R., & Koch, M. E. (1984). The Helen Beebe Speech and Hearing Center. In D. Ling (Ed.), Early intervention for hearing-impaired children: Oral options. College-Hill Press: San Diego.
Erber, N. P. (1969). Interaction of audition and vision in the recognition of oral speech stimuli. Journal of Speech & Hearing Research, 12.
Grant, K. W., Ardell, L. H., et al. (1985). The contribution of fundamental frequency, amplitude envelope, and voicing duration cues to speech-reading in normal-hearing subjects. Journal of the Acoustical Society of America, 77: 67.
Green, K. & Norrix, L. (1997). Acoustic cues to place of articulation and the McGurk effect: The role of release bursts, aspiration and formant transitions. Journal of Speech, Language, and Hearing Research, (3).
Green, K. P. & Kuhl, P. K. (1989). The role of visual information in the processing of place and manner features in speech perception. Perception & Psychophysics, 45(1).
Hockley, S. N. & Polka, L. (1994). A developmental study of audio-visual speech perception using the McGurk paradigm. Journal of the Acoustical Society of America, 96: 339.
Massaro, D. W. (1984). Children's perception of visual and auditory speech. Child Development, 55.
McGurk, H. & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264.
Pichora-Fuller, M. K. (1996). Working memory and speechreading. In D. G. Stork & M. E. Hennecke (Eds.), Speechreading by humans and machines: Models, systems, and applications. Springer-Verlag: Berlin.
Sumby, W. G. & Pollack, I. (1954). Visual contributions to speech intelligibility in noise. Journal of the Acoustical Society of America, 26.
Vaughan, P. (Ed.). (1981). Learning to listen: A book by mothers for mothers of hearing-impaired children. Beaufort Books: Toronto.
Walden, B., Montgomery, A., et al. (1990). Visual biasing of normal and impaired auditory speech perception. Journal of Speech and Hearing Research, 33.
Wechsler, D. (1991). Wechsler Intelligence Scale for Children (3rd ed.). The Psychological Corporation: Texas.
Temporal offset judgments for concurrent vowels by young, middle-aged, and older adults Daniel Fogerty Department of Communication Sciences and Disorders, University of South Carolina, Columbia, South
More informationAuditory and Auditory-Visual Lombard Speech Perception by Younger and Older Adults
Auditory and Auditory-Visual Lombard Speech Perception by Younger and Older Adults Michael Fitzpatrick, Jeesun Kim, Chris Davis MARCS Institute, University of Western Sydney, Australia michael.fitzpatrick@uws.edu.au,
More informationAuditory Scene Analysis
1 Auditory Scene Analysis Albert S. Bregman Department of Psychology McGill University 1205 Docteur Penfield Avenue Montreal, QC Canada H3A 1B1 E-mail: bregman@hebb.psych.mcgill.ca To appear in N.J. Smelzer
More informationA Senior Honors Thesis. Brandie Andrews
Auditory and Visual Information Facilitating Speech Integration A Senior Honors Thesis Presented in Partial Fulfillment of the Requirements for graduation with distinction in Speech and Hearing Science
More informationThe Role of Information Redundancy in Audiovisual Speech Integration. A Senior Honors Thesis
The Role of Information Redundancy in Audiovisual Speech Integration A Senior Honors Thesis Printed in Partial Fulfillment of the Requirement for graduation with distinction in Speech and Hearing Science
More informationSpeech conveys not only linguistic content but. Vocal Emotion Recognition by Normal-Hearing Listeners and Cochlear Implant Users
Cochlear Implants Special Issue Article Vocal Emotion Recognition by Normal-Hearing Listeners and Cochlear Implant Users Trends in Amplification Volume 11 Number 4 December 2007 301-315 2007 Sage Publications
More informationHCS 7367 Speech Perception
Babies 'cry in mother's tongue' HCS 7367 Speech Perception Dr. Peter Assmann Fall 212 Babies' cries imitate their mother tongue as early as three days old German researchers say babies begin to pick up
More informationAcoustics, signals & systems for audiology. Psychoacoustics of hearing impairment
Acoustics, signals & systems for audiology Psychoacoustics of hearing impairment Three main types of hearing impairment Conductive Sound is not properly transmitted from the outer to the inner ear Sensorineural
More informationCommunication with low-cost hearing protectors: hear, see and believe
12th ICBEN Congress on Noise as a Public Health Problem Communication with low-cost hearing protectors: hear, see and believe Annelies Bockstael 1,3, Lies De Clercq 2, Dick Botteldooren 3 1 Université
More informationSpeaker s Notes: AB is dedicated to helping people with hearing loss hear their best. Partnering with Phonak has allowed AB to offer unique
1 General Slide 2 Speaker s Notes: AB is dedicated to helping people with hearing loss hear their best. Partnering with Phonak has allowed AB to offer unique technological advances to help people with
More informationAuditory Perception: Sense of Sound /785 Spring 2017
Auditory Perception: Sense of Sound 85-385/785 Spring 2017 Professor: Laurie Heller Classroom: Baker Hall 342F (sometimes Cluster 332P) Time: Tuesdays and Thursdays 1:30-2:50 Office hour: Thursday 3:00-4:00,
More informationChanges in the Role of Intensity as a Cue for Fricative Categorisation
INTERSPEECH 2013 Changes in the Role of Intensity as a Cue for Fricative Categorisation Odette Scharenborg 1, Esther Janse 1,2 1 Centre for Language Studies and Donders Institute for Brain, Cognition and
More informationVisual impairment and speech understanding in older adults with acquired hearing loss
Visual impairment and speech understanding in older adults with acquired hearing loss Jean-Pierre Gagné École d orthophonie et d audiologie, Université de Montréal, Montréal, Québec, Canada H3C3J7 jean-pierre.gagne@umontreal.ca
More informationAssessment of auditory temporal-order thresholds A comparison of different measurement procedures and the influences of age and gender
Restorative Neurology and Neuroscience 23 (2005) 281 296 281 IOS Press Assessment of auditory temporal-order thresholds A comparison of different measurement procedures and the influences of age and gender
More informationAlthough considerable work has been conducted on the speech
Influence of Hearing Loss on the Perceptual Strategies of Children and Adults Andrea L. Pittman Patricia G. Stelmachowicz Dawna E. Lewis Brenda M. Hoover Boys Town National Research Hospital Omaha, NE
More informationBINAURAL DICHOTIC PRESENTATION FOR MODERATE BILATERAL SENSORINEURAL HEARING-IMPAIRED
International Conference on Systemics, Cybernetics and Informatics, February 12 15, 2004 BINAURAL DICHOTIC PRESENTATION FOR MODERATE BILATERAL SENSORINEURAL HEARING-IMPAIRED Alice N. Cheeran Biomedical
More informationREFERRAL AND DIAGNOSTIC EVALUATION OF HEARING ACUITY. Better Hearing Philippines Inc.
REFERRAL AND DIAGNOSTIC EVALUATION OF HEARING ACUITY Better Hearing Philippines Inc. How To Get Started? 1. Testing must be done in an acoustically treated environment far from all the environmental noises
More informationLecture Outline. The GIN test and some clinical applications. Introduction. Temporal processing. Gap detection. Temporal resolution and discrimination
Lecture Outline The GIN test and some clinical applications Dr. Doris-Eva Bamiou National Hospital for Neurology Neurosurgery and Institute of Child Health (UCL)/Great Ormond Street Children s Hospital
More informationHCS 7367 Speech Perception
Long-term spectrum of speech HCS 7367 Speech Perception Connected speech Absolute threshold Males Dr. Peter Assmann Fall 212 Females Long-term spectrum of speech Vowels Males Females 2) Absolute threshold
More informationINTRODUCTION J. Acoust. Soc. Am. 103 (2), February /98/103(2)/1080/5/$ Acoustical Society of America 1080
Perceptual segregation of a harmonic from a vowel by interaural time difference in conjunction with mistuning and onset asynchrony C. J. Darwin and R. W. Hukin Experimental Psychology, University of Sussex,
More informationwhether or not the fundamental is actually present.
1) Which of the following uses a computer CPU to combine various pure tones to generate interesting sounds or music? 1) _ A) MIDI standard. B) colored-noise generator, C) white-noise generator, D) digital
More informationPerceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli
Psychon Bull Rev (2011) 18:123 128 DOI 10.3758/s13423-010-0027-z Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli Shoko Kanaya & Kazuhiko Yokosawa Published
More informationHearing Aids. Bernycia Askew
Hearing Aids Bernycia Askew Who they re for Hearing Aids are usually best for people who have a mildmoderate hearing loss. They are often benefit those who have contracted noise induced hearing loss with
More informationSpeech, Language, and Hearing Sciences. Discovery with delivery as WE BUILD OUR FUTURE
Speech, Language, and Hearing Sciences Discovery with delivery as WE BUILD OUR FUTURE It began with Dr. Mack Steer.. SLHS celebrates 75 years at Purdue since its beginning in the basement of University
More informationAuditory Steady-State Responses and Speech Feature Discrimination in Infants DOI: /jaaa
J Am Acad Audiol 20:629 643 (2009) Auditory Steady-State Responses and Speech Feature Discrimination in Infants DOI: 10.3766/jaaa.20.10.5 Barbara Cone* Angela Garinis{ Abstract Purpose: The aim of this
More informationHello Old Friend the use of frequency specific speech phonemes in cortical and behavioural testing of infants
Hello Old Friend the use of frequency specific speech phonemes in cortical and behavioural testing of infants Andrea Kelly 1,3 Denice Bos 2 Suzanne Purdy 3 Michael Sanders 3 Daniel Kim 1 1. Auckland District
More informationMODALITY, PERCEPTUAL ENCODING SPEED, AND TIME-COURSE OF PHONETIC INFORMATION
ISCA Archive MODALITY, PERCEPTUAL ENCODING SPEED, AND TIME-COURSE OF PHONETIC INFORMATION Philip Franz Seitz and Ken W. Grant Army Audiology and Speech Center Walter Reed Army Medical Center Washington,
More informationSound localization psychophysics
Sound localization psychophysics Eric Young A good reference: B.C.J. Moore An Introduction to the Psychology of Hearing Chapter 7, Space Perception. Elsevier, Amsterdam, pp. 233-267 (2004). Sound localization:
More informationEffects of speaker's and listener's environments on speech intelligibili annoyance. Author(s)Kubo, Rieko; Morikawa, Daisuke; Akag
JAIST Reposi https://dspace.j Title Effects of speaker's and listener's environments on speech intelligibili annoyance Author(s)Kubo, Rieko; Morikawa, Daisuke; Akag Citation Inter-noise 2016: 171-176 Issue
More information3-D Sound and Spatial Audio. What do these terms mean?
3-D Sound and Spatial Audio What do these terms mean? Both terms are very general. 3-D sound usually implies the perception of point sources in 3-D space (could also be 2-D plane) whether the audio reproduction
More informationWhat you re in for. Who are cochlear implants for? The bottom line. Speech processing schemes for
What you re in for Speech processing schemes for cochlear implants Stuart Rosen Professor of Speech and Hearing Science Speech, Hearing and Phonetic Sciences Division of Psychology & Language Sciences
More informationRESEARCH ON SPOKEN LANGUAGE PROCESSING Progress Report No. 27 (2005) Indiana University
AUDIOVISUAL ASYNCHRONY DETECTION RESEARCH ON SPOKEN LANGUAGE PROCESSING Progress Report No. 27 (2005) Indiana University Audiovisual Asynchrony Detection and Speech Perception in Normal-Hearing Listeners
More informationRESEARCH ON SPOKEN LANGUAGE PROCESSING Progress Report No. 22 (1998) Indiana University
SPEECH PERCEPTION IN CHILDREN RESEARCH ON SPOKEN LANGUAGE PROCESSING Progress Report No. 22 (1998) Indiana University Speech Perception in Children with the Clarion (CIS), Nucleus-22 (SPEAK) Cochlear Implant
More informationEssential feature. Who are cochlear implants for? People with little or no hearing. substitute for faulty or missing inner hair
Who are cochlear implants for? Essential feature People with little or no hearing and little conductive component to the loss who receive little or no benefit from a hearing aid. Implants seem to work
More informationSPEECH PERCEPTION IN A 3-D WORLD
SPEECH PERCEPTION IN A 3-D WORLD A line on an audiogram is far from answering the question How well can this child hear speech? In this section a variety of ways will be presented to further the teacher/therapist
More informationThe Use of FM Technology in school-aged children with Autism Spectrum Disorder
The Use of FM Technology in school-aged children with Autism Spectrum Disorder The Use of FM Technology in School-Aged children with Autism Spectrum Disorder Gary Rance, Kerryn Saunders, Peter Carew, Marlin
More informationCategorical Perception
Categorical Perception Discrimination for some speech contrasts is poor within phonetic categories and good between categories. Unusual, not found for most perceptual contrasts. Influenced by task, expectations,
More informationSlow compression for people with severe to profound hearing loss
Phonak Insight February 2018 Slow compression for people with severe to profound hearing loss For people with severe to profound hearing loss, poor auditory resolution abilities can make the spectral and
More informationAn Update on Auditory Neuropathy Spectrum Disorder in Children
An Update on Auditory Neuropathy Spectrum Disorder in Children Gary Rance PhD The University of Melbourne Sound Foundations Through Early Amplification Meeting, Chicago, Dec 2013 Overview Auditory neuropathy
More informationUSING AUDITORY SALIENCY TO UNDERSTAND COMPLEX AUDITORY SCENES
USING AUDITORY SALIENCY TO UNDERSTAND COMPLEX AUDITORY SCENES Varinthira Duangudom and David V Anderson School of Electrical and Computer Engineering, Georgia Institute of Technology Atlanta, GA 30332
More informationOutcomes of Paediatric Cochlear implantation in Single-Sided Deafness or very Asymmetrical Hearing Loss (SSD/AHL)
Outcomes of Paediatric Cochlear implantation in Single-Sided Deafness or very Asymmetrical Hearing Loss (SSD/AHL) Karyn Galvin 1, Michelle Todorov 1, Rebecca Farrell 2, Robert Briggs 1,2,3,4, Markus Dahm
More informationPsychoacoustical Models WS 2016/17
Psychoacoustical Models WS 2016/17 related lectures: Applied and Virtual Acoustics (Winter Term) Advanced Psychoacoustics (Summer Term) Sound Perception 2 Frequency and Level Range of Human Hearing Source:
More informationIntroduction to Cochlear Implants, Candidacy Issues, and Impact on Job Functioning. Definitions. Definitions (continued) John P. Saxon, Ph. D.
Introduction to Cochlear Implants, Candidacy Issues, and Impact on Job Functioning John P. Saxon, Ph. D., CRC Definitions Hearing impairment: means any degree and type of auditory disorder. Deafness: means
More informationWhat Parents Should Know About Hearing Aids: What s Inside, How They Work, & What s Best for Kids
What Parents Should Know About Hearing Aids: What s Inside, How They Work, & What s Best for Kids Andrea Pittman, PhD CCC-A Arizona State University AZ Hands and Voices August, 2014 Presentation available
More informationCochlear Implants: The Role of the Early Intervention Specialist. Carissa Moeggenberg, MA, CCC-A February 25, 2008
Cochlear Implants: The Role of the Early Intervention Specialist Carissa Moeggenberg, MA, CCC-A February 25, 2008 Case Scenario 3 month old baby with a confirmed severe to profound HL 2 Counseling the
More informationDemonstration of a Novel Speech-Coding Method for Single-Channel Cochlear Stimulation
THE HARRIS SCIENCE REVIEW OF DOSHISHA UNIVERSITY, VOL. 58, NO. 4 January 2018 Demonstration of a Novel Speech-Coding Method for Single-Channel Cochlear Stimulation Yuta TAMAI*, Shizuko HIRYU*, and Kohta
More informationMultimodal interactions: visual-auditory
1 Multimodal interactions: visual-auditory Imagine that you are watching a game of tennis on television and someone accidentally mutes the sound. You will probably notice that following the game becomes
More informationACOUSTIC AND PERCEPTUAL PROPERTIES OF ENGLISH FRICATIVES
ISCA Archive ACOUSTIC AND PERCEPTUAL PROPERTIES OF ENGLISH FRICATIVES Allard Jongman 1, Yue Wang 2, and Joan Sereno 1 1 Linguistics Department, University of Kansas, Lawrence, KS 66045 U.S.A. 2 Department
More informationSpeech Spectra and Spectrograms
ACOUSTICS TOPICS ACOUSTICS SOFTWARE SPH301 SLP801 RESOURCE INDEX HELP PAGES Back to Main "Speech Spectra and Spectrograms" Page Speech Spectra and Spectrograms Robert Mannell 6. Some consonant spectra
More informationFREQUENCY. Prof Dr. Mona Mourad Dr.Manal Elbanna Doaa Elmoazen ALEXANDRIA UNIVERSITY. Background
FREQUENCY TRANSPOSITION IN HIGH FREQUENCY SNHL Prof Dr. Mona Mourad Dr.Manal Elbanna Doaa Elmoazen Randa Awad ALEXANDRIA UNIVERSITY Background Concept Of Frequency Transposition Frequency transposition
More informationOptimal Design: Bekesy Test for Mobile
Optimal Design: Bekesy Test for Mobile By Samir Dahmani Team 2: Joseph Wolanski, Nihit Mody, Samir Dahmani Client 19: Douglas Oliver, 263 Farmington Avenue, Farmington September 19 th 2012 1. Optimal Design
More informationILLUSIONS AND ISSUES IN BIMODAL SPEECH PERCEPTION
ISCA Archive ILLUSIONS AND ISSUES IN BIMODAL SPEECH PERCEPTION Dominic W. Massaro Perceptual Science Laboratory (http://mambo.ucsc.edu/psl/pslfan.html) University of California Santa Cruz, CA 95064 massaro@fuzzy.ucsc.edu
More informationWhat is sound? Range of Human Hearing. Sound Waveforms. Speech Acoustics 5/14/2016. The Ear. Threshold of Hearing Weighting
Speech Acoustics Agnes A Allen Head of Service / Consultant Clinical Physicist Scottish Cochlear Implant Programme University Hospital Crosshouse What is sound? When an object vibrates it causes movement
More informationHearing II Perceptual Aspects
Hearing II Perceptual Aspects Overview of Topics Chapter 6 in Chaudhuri Intensity & Loudness Frequency & Pitch Auditory Space Perception 1 2 Intensity & Loudness Loudness is the subjective perceptual quality
More informationIntegral Processing of Visual Place and Auditory Voicing Information During Phonetic Perception
Journal of Experimental Psychology: Human Perception and Performance 1991, Vol. 17. No. 1,278-288 Copyright 1991 by the American Psychological Association, Inc. 0096-1523/91/S3.00 Integral Processing of
More information