NIH Public Access Author Manuscript
Published in final edited form as: Curr Dir Psychol Sci. 2009 October; 18(5): 275-279. doi:10.1111/j.1467-8721.2009.01651.x.

The Importance of Sound for Cognitive Sequencing Abilities: The Auditory Scaffolding Hypothesis

Christopher M. Conway(1), Saint Louis University
David B. Pisoni, Indiana University, Bloomington, and Indiana University School of Medicine
William G. Kronenberger, Indiana University School of Medicine

Abstract

Sound is inherently a temporal and sequential signal. Experience with sound therefore may help bootstrap (i.e., provide a kind of scaffolding for) the development of general cognitive abilities related to representing temporal or sequential patterns. Accordingly, the absence of sound early in development may result in disturbances to these sequencing skills. In support of this hypothesis we present two types of findings. First, normal-hearing adults do best on sequencing tasks when the sense of hearing, rather than vision, can be used. Second, recent findings suggest that deaf children show disturbances on exactly these same kinds of tasks that involve learning and manipulation of serial-order information. We suggest that sound provides an auditory scaffolding for time and serial order behavior, possibly mediated through neural connections between the temporal and frontal lobes of the brain. Under conditions of auditory deprivation, auditory scaffolding is absent, resulting in neural reorganization and a disturbance to cognitive sequencing abilities.

Keywords: sound; deafness; sequence learning; language; prefrontal cortex

It is customary to consider sound as being the province of auditory perception alone. However, recent findings and theory have emphasized the interactive nature of the sensory modalities as well as the ways in which sensory processing underlies higher cognition. For example, multisensory processing, in which multiple senses (vision, audition, touch) are used in concert, is beginning to be regarded as the norm, not the exception, in perception. Furthermore, embodied cognition theories, which stress the close coupling of brain, body, and sensory systems, emphasize the importance of understanding how the dynamics of modality-specific constraints affect higher-level cognition such as learning and memory. To put it another way: because the brain is an integrated functional system, sensory processing (and, by extension, the effects of sensory deprivation) is not completely independent from the rest of neurocognition and thus may have secondary effects on the brain and cognition as a whole.

(1) Address correspondence to Christopher M. Conway, 3511 Laclede Ave., Department of Psychology, Saint Louis University, St. Louis, MO 63103, Ph: 314-977-2299, Fax: 314-977-1014, cconway6@slu.edu.

Sound in particular is a temporal and sequential signal, one in which time and serial order are of primary importance (Hirsh, 1967). Because of this quality of sound, we argue that hearing provides vital exposure to serially ordered events, bootstrapping the development of sequential processing and behavior. Sound thus provides a scaffolding, a supporting framework that organisms use to learn how to interpret and process sequential information. The auditory scaffolding hypothesis is backed by two lines of evidence: modality-specific constraints in hearing populations, and non-auditory sequencing abilities in the congenitally deaf.

MODALITY CONSTRAINTS

Sequential Learning

Previous research has suggested that the sensory modality used in a particular task constrains cognitive functioning. Specifically, for any task that requires the perception, learning, or memory of events where their order or timing is important, people do best when they can rely on hearing. Consider sequences of light flashes or auditory tones that occur at varying rates of presentation. Adults can more accurately perceive and reproduce the auditory compared to the visual patterns (Collier & Logan, 2000). Furthermore, it is a well-established phenomenon that short-term verbal memory is better with auditory presentation (i.e., spoken words) than with visual presentation (i.e., written words) (Penney, 1989). To explain some of these findings, Glenberg and Jona (1991) argued that the coding of time for auditory events is more accurate than for visual events.

More recently, an auditory superiority effect was found with sequential learning (Conway & Christiansen, 2005). Three groups of participants were exposed to auditory, visual, or tactile sequential patterns (see Figure 1). All stimuli were nonlinguistic and thus were not easy to verbalize. Participants were not told that the sequences they were exposed to were generated by an artificial grammar that determined the order in which each stimulus could occur in a sequence. For example, a grammar might dictate that tone 1 can follow tone 2 and tone 3 each 50% of the time, and can never come after tones 4 or 5. Following incidental exposure to the sequential patterns, participants were next exposed to novel patterns in the same sense modality as before, but this time half of the sequences were generated from the same grammar while the other half violated the rules of the grammar in some way. Participants were instructed to classify each sequence in terms of whether it was generated from the same grammar or not. The results on the grammaticality classification task revealed that the magnitude of auditory learning (75% of items correct) was much greater than either tactile or visual learning (62% each). In fact, because in this task 50% is considered chance performance (no learning), a score of 75% (75 - 50 = 25) signifies twice as much learning as a score of 62% (62 - 50 = 12). Furthermore, additional work has shown that whereas visual sequence learning declines at fast rates of presentation (8 stimuli/sec), auditory sequence learning is robust even in the face of fast presentation rates (Conway & Christiansen, in press). Taken together with previous work showing modality-specific constraints in perception and memory, the sequence learning findings suggest that the auditory modality excels at encoding and processing temporal and sequential relations in the environment.
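To make the logic of this grammaticality classification design concrete, the minimal sketch below generates legal sequences by walking a small finite-state transition table, checks test items against the grammar's legal transitions, and expresses endorsement accuracy relative to the 50% chance baseline, exactly as in the 75 - 50 = 25 versus 62 - 50 = 12 comparison above. The specific grammar, element labels, and sequence lengths are illustrative assumptions, not the materials used by Conway and Christiansen (2005).

```python
import random

# Hypothetical artificial grammar: each element maps to the elements that may
# legally follow it (an illustrative stand-in, not the published grammar).
GRAMMAR = {
    "start": [1, 4],
    1: [2, 3],
    2: [1, 5],
    3: [1, 5],
    4: [2, 5],
    5: [],  # terminal element
}

def generate_sequence(grammar, max_len=8):
    """Generate one legal sequence by walking the transition table."""
    seq = []
    current = random.choice(grammar["start"])
    while current is not None and len(seq) < max_len:
        seq.append(current)
        followers = grammar[current]
        current = random.choice(followers) if followers else None
    return seq

def is_grammatical(seq, grammar):
    """Check whether every transition in a test sequence is legal."""
    return all(b in grammar[a] for a, b in zip(seq, seq[1:]))

def chance_adjusted_learning(percent_correct, chance=50):
    """Express classification accuracy relative to chance (e.g., 75 -> 25)."""
    return percent_correct - chance

print(is_grammatical(generate_sequence(GRAMMAR), GRAMMAR))  # True by construction
print(chance_adjusted_learning(75))  # 25 points above chance (auditory)
print(chance_adjusted_learning(62))  # 12 points above chance (visual/tactile)
```

In this framing, "twice as much learning" simply means twice the distance above the chance baseline, which is why 75% correct counts as double the learning shown by 62% correct.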
SEQUENCING SKILLS IN THE DEAF

Although it is common to consider deafness as affecting the sense of hearing alone, we argue that because sound is the primary gateway to understanding temporal and sequential events, auditory deprivation may result in significant disturbances on a wide range of other tasks (Conway, Karpicke, & Pisoni, 2007). For instance, Bavelier, Dye, and Hauser (2006) have argued that deafness results in a reorganization of cortical function. Therefore, losing the sense of audition early in development may set up a cascade of complex effects that alter a child's entire suite of perceptual and cognitive abilities, not just those directly related to hearing and the processing of acoustic signals. According to the auditory scaffolding hypothesis, deafness may especially affect cognitive abilities related to learning, recalling, and producing sequential information. A delay or disorder in domain-general sequencing skills, triggered by a lack of auditory stimulation at an early age, could pose a significant impediment to normal development. Indeed, previous work suggests that the profoundly deaf show disturbances in (non-auditory) functions related to time and serial order, including immediate serial recall (Marschark, 2006).

Some Recent Findings

Recent findings from our research group tested sequencing skills in two groups of children aged 5-10 years: deaf children with cochlear implants (CIs) and an age-matched hearing group. A CI is an electronic device that directly stimulates the auditory nerve to create the percept of hearing. Children's motor sequencing abilities were assessed using a fingertip tapping task from the NEPSY: A Developmental Neuropsychological Assessment (Korkman, Kirk, & Kemp, 1998). One version of the task required the child to repetitively tap together his/her thumb and index finger as fast as possible. Another version required the child to tap the tip of his/her thumb to the index, middle, ring, and pinky fingers in that order as quickly as possible. Overall, the deaf children with CIs performed worse than the control group and were also atypical relative to the published normative data (Conway et al., 2009a). Importantly, the deaf children were not impaired on several other non-sequencing tasks, such as visual-spatial memory and tactile perception.

We also tested the children with a visual sequential learning task similar in methodology to the adult studies discussed previously (Conway, Pisoni, Anaya, Karpicke, & Henning, 2009b). In this task, the children were exposed to sequences of colored squares displayed on a touch-sensitive screen. The children were required to remember the sequence of colors on the screen and then reproduce each one by pressing the touch-screen panels in the correct order. Note that because each color is uniquely associated with a particular location on the screen, a child may be remembering a sequence of colors, a sequence of locations, or both. Unbeknownst to the children, all of the sequences initially were generated from an artificial grammar. After completing the reproduction task for a subset of visual sequences, the experiment seamlessly transitioned to a test phase, which consisted of new sequences generated from the same grammar and new sequences generated from a novel grammar. For each subject a learning score was calculated, which measured the extent to which sequence memory spans improved for sequences generated from the same grammar compared to the novel grammar. The results showed that the normal-hearing children's sequence learning score was significantly greater than that of the deaf children, who on average showed no learning.
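As an illustration only, the sketch below shows one way such a learning score could be computed from test-phase data: the memory span achieved on sequences from the familiar grammar minus the span achieved on sequences from the novel grammar. The span-scoring rule (longest sequence reproduced without error) and the data format are assumptions made for the sake of the example; the article does not specify the exact formula used.

```python
def memory_span(trials):
    """Longest sequence length reproduced correctly (illustrative scoring rule)."""
    correct_lengths = [len(t["target"]) for t in trials
                       if t["response"] == t["target"]]
    return max(correct_lengths, default=0)

def learning_score(familiar_trials, novel_trials):
    """Span advantage for sequences from the familiar grammar over the novel one.
    Positive values indicate implicit learning of the familiar grammar."""
    return memory_span(familiar_trials) - memory_span(novel_trials)

# Hypothetical test-phase data for one child
familiar = [{"target": [1, 3, 2], "response": [1, 3, 2]},
            {"target": [1, 3, 2, 5], "response": [1, 3, 2, 5]}]
novel = [{"target": [4, 1, 5], "response": [4, 1, 2]},
         {"target": [2, 4], "response": [2, 4]}]

print(learning_score(familiar, novel))  # 4 - 2 = 2 (evidence of learning)
```

Under this kind of scoring, a learning score near zero means that exposure to the familiar grammar conferred no span advantage, which is the pattern reported for the deaf children on average.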
Furthermore, whereas 73% of the normal-hearing sample showed some effect of implicit sequence learning, only 39% of the deaf children did. The results from the fingertip tapping and sequential learning tasks demonstrate that deaf children show atypical motor and visual sequence learning compared to age-matched normal-hearing children. What both tasks have in common is that they require facility with encoding or producing sequential patterns. Taken together, these findings suggest that a period of deafness early in development may cause secondary disturbances to non-auditory sequencing skills.

Implications for Language Development

Figure 2 presents a general framework for the interactive relationship between sound, sequencing skills, and spoken language development. Clearly, a lack of sound hinders the development of spoken language. However, this framework depicts an additional effect of auditory deprivation: delays in non-auditory sequencing functions. Even for those deaf children who have a CI surgically implanted and therefore receive sound via electrical stimulation, cognitive and perceptual sequencing skills appear to be delayed and/or reorganized. Delays in sequence learning likely contribute to problems learning the complex grammatical patterns of spoken language (Ullman, 2004). That is, a deaf child with a CI may be able to accurately hear (i.e., detect and discriminate) individual sounds occurring in sequence (i.e., words), but may have difficulties performing higher-level cognitive operations on those sounds (i.e., learning the sequential regularities of words in spoken language) that form the basis of grammatical knowledge. This implies that deaf children who receive new auditory input via a CI, and who appear to be well on the path to developing spoken language abilities, may in fact have comorbid delays in sequencing functions that contribute to difficulties with certain aspects of language development. An association between sequence learning skill and language has been observed empirically (Conway et al., 2007; 2009a; 2009b).

POSSIBLE MECHANISMS

We consider here two possible ways in which these effects may be instantiated. One possibility is that listening to sounds, especially those that are easy to verbalize, provides an opportunity to automatically imitate (i.e., vocally rehearse) what is heard, either overtly or covertly. Imitating what is heard attaches a discrete verbal label to a continuous auditory signal, providing anchor points for learning associations among the discrete symbols. Under this embodied account, hearing thus recruits vocal rehearsal processes that presumably strengthen the development of domain-general implicit sequence learning abilities. A second possible mechanism relies on the fact that all environmental input likely includes modality-neutral information in addition to the modality-specific signal itself (cf. Rosenblum, 2008). Sound, compared to vision, may specifically carry higher-level patterns of information related to temporal change and serial order. Under this view, hearing is the primary gateway for perceiving high-level sequential patterns of input that change over time (rather than over space). The development of fundamental sequence learning mechanisms would thus be delayed when this type of input is unavailable, as is the case in deafness.

Both the embodied and the modality-neutral accounts, which are not necessarily mutually exclusive, are consistent with neurobiological data suggesting cortical reorganization in the deaf, especially in the prefrontal cortex. This region of the frontal lobe plays a critical role in learning, planning, and executing sequences of thoughts and actions (Fuster, 2001). Electrophysiological data suggest that deaf children, compared to hearing peers, show decreased cerebral maturation in left fronto-temporal regions and bilateral frontal regions (Wolff & Thatcher, 1990).
A lack of auditory input due to deafness may reduce auditory-frontal connectivity (Emmorey et al., 2003), fundamentally altering the neural organization of the frontal lobe and, crucially, the prefrontal cortex. Delayed cortical maturation in this region could have significant effects on the development of cognitive and motor sequencing skills used in language and other aspects of cognitive processing.

One additional implication of delayed maturation in the frontal lobe is that deaf children may also display difficulties with some types of abilities known as executive functions, which include domain-general control processes such as working memory, response inhibition, self-regulation, and goal-oriented behavior (see relevant chapters in Marschark & Hauser, 2008). Thus, it may be possible that a period of profound deafness results in widespread disturbances to frontal lobe-related executive functions more generally, with sequence learning being just one behavioral indication.

SUMMARY AND OUTSTANDING QUESTIONS

Sound appears to provide a perceptual and cognitive scaffolding for the development of functions related to time and serial order behavior. Experience with perceiving and producing sound helps organisms learn how to encode and manipulate sequential information, while a lack of auditory stimulation hinders the development of these skills. Although we believe that the auditory scaffolding hypothesis currently best explains the findings presented, much work remains in order to rule out alternative explanations. One possibility is that the disturbances in sequencing functions that have been observed in the deaf are not due to auditory deprivation per se, but rather to differences in the social environments of deaf and hearing children. A deaf child's sociocultural environment may be very different from that of a hearing child, due to differences in educational opportunities, parenting styles, and other social, communicative, and emotional factors (Marschark & Hauser, 2008). It is currently underspecified how such social environmental factors may affect cognitive development, and specifically, sequence learning abilities. Another possibility is that a lack of experience and skill with spoken language specifically (rather than with hearing more generally) affects cognitive sequencing abilities. For instance, it is known that deaf children have particular difficulties with sequence learning and memory when the input is verbal or easily coded in verbal form (Dawson, Busby, McKay, & Clark, 2002). However, as we have reviewed, deafness also appears to affect even non-auditory and nonverbal sequencing skills, suggesting that impaired performance by deaf individuals on sequence learning tasks is not merely due to difficulties with processing verbal information.

To help disambiguate the multiple factors that may be at play in cognitive development in both hearing and hearing-impaired populations, there is a need to carefully study specific aspects of cognitive and neural development in hearing, hearing-impaired, and deaf children with and without CIs. Additional ways of investigating the impact of sensory deprivation on brain and cognition include: animal studies (through destruction of the peripheral sensory pathways); brain imaging and electrophysiological techniques to study cortical reorganization in the deaf; and even the use of neural networks to model the development of sequence learning in the face of reduced sequential input (a minimal sketch of this kind of simulation follows the summary below).

In summary, we believe that when all findings are taken together, the auditory scaffolding hypothesis is useful because it integrates findings in both hearing and hearing-impaired populations, providing novel and testable predictions. The findings and theory reviewed suggest that the role of sound in cognition goes far beyond domain-specific auditory perception.
Sound bootstraps the development of cognitive processes that rely on the encoding, learning, and manipulation of information and behaviors occurring in sequence. Because sequencing abilities directly impact many aspects of cognition, including perception, sensory-motor control, language, and higher-level functions, these findings have profound implications for understanding a wide range of issues related to neurocognitive development and plasticity in normal-hearing, hearing-impaired, and language-disturbed populations.
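Following up on the neural-network modeling suggestion above, here is a minimal sketch of how such a simulation might look: an Elman-style simple recurrent network, trained online with one-step truncated backpropagation to predict the next element of sequences drawn from a toy artificial grammar, with the amount of sequential input manipulated to mimic reduced auditory experience. The grammar, network sizes, and training details are illustrative assumptions, not a model reported by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6        # number of distinct sequence elements (one-hot coded)
HIDDEN = 12  # hidden-layer size (arbitrary choice for the sketch)

# Toy transition table standing in for an artificial grammar (illustrative only).
GRAMMAR = {0: [1, 3], 1: [2, 4], 2: [0, 5], 3: [4, 5], 4: [0, 2], 5: [1, 3]}

def make_stream(n_steps):
    """Random walk through the grammar, returned as a list of element indices."""
    state, stream = 0, []
    for _ in range(n_steps):
        state = int(rng.choice(GRAMMAR[state]))
        stream.append(state)
    return stream

def train_srn(stream, lr=0.05):
    """Elman-style SRN trained online (one-step truncated backprop) to predict
    the next element; returns mean cross-entropy late in training."""
    Wxh = rng.normal(0.0, 0.1, (HIDDEN, N))
    Whh = rng.normal(0.0, 0.1, (HIDDEN, HIDDEN))
    Why = rng.normal(0.0, 0.1, (N, HIDDEN))
    h = np.zeros(HIDDEN)
    losses = []
    for t in range(len(stream) - 1):
        x = np.zeros(N)
        x[stream[t]] = 1.0
        h_prev = h
        h = np.tanh(Wxh @ x + Whh @ h_prev)   # context-sensitive hidden state
        logits = Why @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()                          # softmax over next-element predictions
        target = stream[t + 1]
        losses.append(-np.log(p[target]))
        # One-step backprop: the previous hidden state is treated as a constant.
        dlogits = p.copy()
        dlogits[target] -= 1.0
        dh = (Why.T @ dlogits) * (1.0 - h ** 2)
        Why -= lr * np.outer(dlogits, h)
        Wxh -= lr * np.outer(dh, x)
        Whh -= lr * np.outer(dh, h_prev)
    return float(np.mean(losses[-200:]))

# Full versus reduced sequential input: less exposure should leave higher
# next-element prediction error, mimicking weaker sequence learning.
print(train_srn(make_stream(5000)))  # e.g., lower late-training error
print(train_srn(make_stream(500)))   # e.g., higher late-training error
```

In a simulation of this kind, the comparison of interest is the late-training prediction error under full versus reduced exposure, paralleling the behavioral learning scores discussed earlier.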

Acknowledgments

This work was supported by the following grants from the National Institutes of Health: R03DC009485, T32DC00012, 5R01DC00111, and 2R01DC000064. We wish to thank Arthur Glenberg for suggesting the embodied account as a possible mechanism underlying the auditory scaffolding effects.

References

Bavelier D, Dye MWG, Hauser PC. Do deaf individuals see better? Trends in Cognitive Sciences. 2006; 10:512-518. [PubMed: 17015029]
Collier GL, Logan G. Modality differences in short-term memory for rhythms. Memory & Cognition. 2000; 28:529-538.
Conway CM, Christiansen MH. Seeing and hearing in space and time: Effects of modality and presentation rate on implicit statistical learning. European Journal of Cognitive Psychology. (in press).
Conway CM, Christiansen MH. Modality-constrained statistical learning of tactile, visual, and auditory sequences. Journal of Experimental Psychology: Learning, Memory, & Cognition. 2005; 31:24-39.
Conway CM, Karpicke J, Anaya EM, Henning SC, Kronenberger WK, Pisoni DB. Nonverbal cognition in deaf children following cochlear implantation: Motor sequencing disturbances mediate language delays. 2009a. Manuscript under review.
Conway CM, Karpicke J, Pisoni DB. Contribution of implicit sequence learning to spoken language processing: Some preliminary findings with hearing adults. Journal of Deaf Studies and Deaf Education. 2007; 12:317-334. [PubMed: 17548805]
Conway CM, Pisoni DB, Anaya EM, Karpicke J, Henning SC. Implicit sequence learning in deaf children with cochlear implants. 2009b. Manuscript under revision.
Dawson PW, Busby PA, McKay CM, Clark GM. Short-term auditory memory in children using cochlear implants and its relevance to receptive language. Journal of Speech, Language, and Hearing Research. 2002; 45:789-801.
Emmorey K, Allen JS, Bruss J, Schenker N, Damasio H. A morphometric analysis of auditory brain regions in congenitally deaf adults. Proceedings of the National Academy of Sciences. 2003; 100:10049-10054.
Fuster J. The prefrontal cortex, an update: Time is of the essence. Neuron. 2001; 30:319-333. [PubMed: 11394996]
Glenberg AM, Jona M. Temporal coding in rhythm tasks revealed by modality effects. Memory & Cognition. 1991; 19:514-522.
Hauser, PC.; Lukomski, J.; Hillman, T. Development of deaf and hard-of-hearing students' executive function. In: Marschark, M.; Hauser, P., editors. Deaf cognition: Foundations and outcomes. New York: Oxford University Press; 2008. p. 286-308.
Hirsh, IJ. Information processing in input channels for speech and language: The significance of serial order of stimuli. In: Darley, FL., editor. Brain mechanisms underlying speech and language. New York: Grune & Stratton; 1967. p. 21-38.
Korkman, M.; Kirk, U.; Kemp, S. NEPSY: A developmental neuropsychological assessment. San Antonio, TX: Psychological Corporation; 1998.
Marschark M. Intellectual functioning of deaf adults and children: Answers and questions. European Journal of Cognitive Psychology. 2006; 18:70-89.
Marschark, M.; Hauser, P. Deaf cognition: Foundations and outcomes. New York: Oxford University Press; 2008.
Penney CG. Modality effects and the structure of short-term verbal memory. Memory & Cognition. 1989; 17:398-422.
Rosenblum LD. Speech perception as a multimodal phenomenon. Current Directions in Psychological Science. 2008; 17:405-409.
Ullman MT. Contributions of memory circuits to language: The declarative/procedural model. Cognition. 2004; 92:231-270. [PubMed: 15037131]

Wolff AB, Thatcher RW. Cortical reorganization in deaf children. Journal of Clinical and Experimental Neuropsychology. 1990; 12:209-221. [PubMed: 2341551]

RECOMMENDED READINGS

1. Conway CM, Pisoni DB. Neurocognitive basis of implicit learning of sequential structure and its relation to language processing. Annals of the New York Academy of Sciences. 2008; 1145:113-131. A review of sequential learning, its role in language, possible neurobiological substrates, and differences in auditory versus visual processing. [PubMed: 19076393]
2. Marschark M, Hauser P. 2008. See reference list. An edited volume that highlights recent findings in cognitive abilities of the deaf, including language development, learning and memory, attention, numerical cognition, and executive functions.
3. Myklebust HR, Brutten M. A study of the visual perception of deaf children. Acta Oto-Laryngologica Supplementum. 1953; 105:1-126. An early study reporting that deaf children differ on a variety of measures of visual perception, concluding that sensory deprivation affects the development of the entire organism, not just modality-specific processing. [PubMed: 13040006]
4. Lashley, KS. The problem of serial order in behavior. In: Jeffress, LA., editor. Cerebral mechanisms in behavior. New York: Wiley; 1951. p. 112-146. Classic paper arguing for the importance of serial ordering and sequencing abilities in language and other skillful behaviors.
5. Pisoni DB. Cognitive factors and cochlear implants: Some thoughts on perception, learning, and memory in speech perception. Ear & Hearing. 2000; 21:70-78. Argues that research on cochlear implants in children needs to shift from an emphasis on the study of audiological and demographic factors to the investigation of neural, cognitive, and psychological processes that mediate successful language outcome. [PubMed: 10708075]

Figure 1. Tactile, visual, and auditory sequences used in Conway & Christiansen (2005) to investigate possible modality differences in implicit sequential learning. Tactile stimulation was accomplished via vibrotactile pulses delivered to the five fingers of one of each participant's hands. Visual sequences consisted of black squares appearing at different spatial locations, one at a time. Auditory sequences consisted of tone patterns.

Figure 2. Proposed framework for understanding the relations among sound, cognitive sequencing, and spoken language development. The solid lines represent effects among these factors that are consistent with the present findings. The dotted line represents the possibility of an additional but as yet unspecified influence: spoken language skills affecting the development of general cognitive sequencing abilities.