
Disparity of Non-verbal Language Learning
Identical Rates of Acquisition, Neurological Adaptations, Cognitive Outcomes

Bethel Sileshi
Science 100 Capstone, University of Alberta
April 7, 2014

Introduction

It is remarkable that infants can learn any language at the same rate, regardless of the language (Petitto & Marentette, 1991). This also applies to signed languages, for example American Sign Language (ASL) or Langue des Signes Québécoise (LSQ), which are often used by deaf people. People born with severe (70-89 dB loss) or profound (>90 dB loss) hearing loss are referred to as deaf (Mayberry, 2002), and although they cannot hear conversational speech (approximately 60 dB), they can still build a strong foundation in language. Language involves input, thought and output: a process of receiving, interpreting and responding in order to communicate. It is not intrinsically linked to sound. Hearing children can distinguish speech sounds from other types of sound very soon after birth, but for deaf children speech is inaudible, so a signed language is learned instead. Because the input takes a different form, language acquisition produces different neurological adaptations in the brains of hearing and deaf subjects (Neville and Mills, 1997). These neurological changes lead to different cognitive outcomes in the deaf population: academic achievement is hindered, IQ test performance is slightly affected, and visuospatial abilities are improved. After describing similarities in language acquisition between deaf and hearing subjects, and discussing the differences that have been studied, this report examines the effects of deafness on cognitive development.

Similarities in Language Learning

Infants begin language acquisition with babbling at around 7-10 months. Although purposeful cries often indicate a desire or emotion (e.g. hunger, tiredness, loneliness), the spontaneous and meaningless vocalizations of babbling typically develop into words from about 12 months onward. Deaf children exposed to signed languages from birth go through the same initial acquisition phase, beginning with repetitive hand gestures known as manual babbling.
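The parallel timeline described in this section can be restated as a small sketch. The age ranges are those reported in the text (Petitto & Marentette, 1991); the stage labels and function name are informal shorthand for illustration, not terms from the study:

```python
# Approximate acquisition milestones (in months) reported for both
# hearing infants (vocal) and deaf infants exposed to sign (manual).
# Ranges are as cited in the text; labels are informal shorthand.
MILESTONES = {
    "babbling (vocal or manual)": (7, 10),
    "first word / first sign": (11, 14),
    "two-word / two-sign combinations": (16, 22),
}

def stages_begun_by(age_months):
    """List the milestones an infant is typically expected to have
    begun by the given age, per the onset ages above."""
    return [name for name, (onset, _) in MILESTONES.items()
            if age_months >= onset]

print(stages_begun_by(12))
# A 12-month-old is expected to have begun babbling and first words/signs.
```

The point of the sketch is simply that the same table serves both modalities: only the output channel differs, not the schedule.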
Research has unambiguously revealed that manual babbling is structurally identical to vocal babbling: it is spontaneous and meaningless, and it possesses unique phonetic units with syllabic organization (Petitto & Marentette, 1991). Manual babbling is also entirely different from the ordinary rhythmic hand movements of hearing and deaf babies, revealing a separation between speech and language. In other words, language without vocalization can develop through the same stages as spoken language. This paradigm shift confirmed that babbling is a critical stage in language acquisition and leads to the same linguistic milestones in deaf and hearing children (Petitto & Marentette, 1991). After babbling, both hearing and deaf infants reach their first-word stage (11-14 months) and first two-word stage (16-22 months) in the same time frame, and continue with similar grammatical and semantic developments (Petitto & Marentette, 1991).

Differences in Language Learning

Although deaf children learn sign language from deaf parents as naturally as hearing children learn spoken language from hearing parents, the modality of the language (vocal or visual) has different impacts on the brain. Studies of native ASL acquisition suggest that similar systems within the left hemisphere are involved in processing all natural languages. Areas within the right hemisphere are also associated with language processing when acquisition relies on the perception of gestures and spatial location (Neville and Mills, 1997). These areas have been analysed by various methods, including scalp topography, functional magnetic resonance imaging (fMRI), and volumetric analysis.

Scalp Topography

Scalp topography is a way to study cortical evoked potentials in the brain by employing electrodes that monitor nerve stimulation. One study of the development of neural systems in language processing found different patterns of brain activity in normal, monolingual, right-handed hearing adults compared with congenitally deaf adults. In the experiment, 17 hearing subjects and 10 congenitally deaf subjects were presented with nouns and verbs while connected to an array of scalp electrodes. The resulting scalp topography identified areas of differing brain activity (Figure 1).

Figure 1. Scalp topography of current densities (left and right hemispheres).

The top row of Figure 1 shows the averaged results from the hearing subjects, while the bottom row shows the combined results from the deaf subjects. The maps share the same scale and show little difference in the right hemisphere, but a significant difference in the left hemisphere. On the maps, the longer wavelengths (purple, green, blue) represent current flowing out of the head, while the shorter wavelengths (red, orange, yellow) represent current sinks. In the left hemisphere of the hearing subjects there is a current sink over the anterior temporal region (black arrows) that is missing in the deaf subjects (Neville and Mills, 1997). The study further employed functional magnetic resonance imaging to identify the differing regions implicated in language comprehension.

Functional Magnetic Resonance Imaging (fMRI)

fMRI measures blood flow in the active brain, and so studies neural activity indirectly. Figure 2 shows the increases in blood oxygenation of cortical areas on fMRI. There is significant activation of the left hemisphere when hearing adults read their native language, English (top). When deaf adults read English as their second language (middle), there is much less activation in the left hemisphere. Conversely, when deaf subjects view sentences in their native ASL (bottom), there is greater activation in the same areas observed in the hearing subjects. These data suggest that specific neural systems mediate language, and that those systems are altered as a consequence of deafness and the acquisition of sign language (Neville and Mills, 1997).

Figure 2. Cortical areas showing blood oxygenation on fMRI.

Figure 3. Schematic of brain lobes and significant areas.

Comparing Figures 1 and 2 with Figure 3, the greatest difference between the brains occurs around Broca's area, in the anterior temporal region. This region, implicated in motor speech production, has also been found to play a role in the sequencing of actions (Nishitani et al., 2005). While language processing and sentence formation may be impaired in deaf people, their brains show Broca's area to be activated when they are presented with either written English or sign language sentences (Neville and Mills, 1997). Recent studies have shown that Broca's area plays a role in processing hand actions and facial gestures (Nishitani et al., 2005). While the exact mechanism of reorganization for visual as compared to auditory processing is not known, studies suggest that sign language, along with lip reading, may affect the organization of Broca's area (Leporé et al., 2010).

Volumetric Analysis

The functional reorganization that occurs throughout the brain can also be investigated using volumetric analysis of gray and white matter. White matter is the tissue, predominantly glia (non-neuronal cells), that transfers messages between regions of gray matter: it informs us about the structural connections in the brain. A study examining structural differences between 14 native-signing deaf individuals and 16 matched hearing controls hypothesized hypertrophy (non-tumorous enlargement) in Broca's area, as well as in the motor and language cortices (Leporé et al., 2010). No significant volumetric differences were detected at the lobar level (Figure 4); however, Broca's area, the cerebellum and the splenium (the posterior end of the corpus callosum) were found to be larger in deaf people. Broad regions of hypertrophy in the brains of deaf people were also discovered (Figure 5).

Figure 4. Volumetric differences in white and gray matter at the lobar level.

As expected, Broca's area and other regions associated with the motor and language cortices displayed hypertrophy (Figure 5).

Figure 5. Regions of hypertrophy (red areas) in white matter.

Since sign language is primarily received by the visual cortex and travels to associated cortices for higher-level analysis, it is understandable that these areas would be enlarged. To produce language, the analyzed information is processed in Wernicke's area (Figure 3), and the motor cortex is engaged to produce a responsive sign (Leporé et al., 2010). The whole network interacts and relies on the splenium of the corpus callosum to transfer information between the hemispheres: its increased role in transferring visual information is one theory for its enlargement.

Cognitive Outcomes of Different Language Learning

Though the reasons for the neurological adaptations in the brains of hearing and deaf subjects are not fully understood, these adaptations produce variations in cognitive development. Cognitive development is the overarching term for the complex and logical thought processes of developing children; it encompasses the multi-faceted ways a child matures and comes to understand his or her environment (Mayberry, 2002). Comparing hearing and deaf children on academic achievement, performance on standardized intelligence tests, and visual-spatial skills will be the basis for identifying differences in cognitive outcomes.

Academic Achievement

Several US national studies track the performance of deaf students on standardized assessments, and these results are a common way to measure cognitive development. One national study using the Stanford Achievement Test Series (administered by the Gallaudet Research Institute) found that in mathematics about 80% of deaf students attained scores corresponding to the lowest quartile of the general population. In vocabulary and reading comprehension, fewer than 10% of deaf students were at or above grade level (Mitchell, 2009). The disparity between deaf students and their hearing peers suggests that children with hearing loss have particular difficulty in the reading and mathematical areas of academic achievement. On average, children with mild to moderate hearing loss achieve one to four grade levels lower than their hearing peers (Mitchell, 2009). Deaf high school students scored on average 7.4 grades below grade level on tests of reading, and 5.4 grades below grade level on tests of mathematics. In one test, the median math computation skill of 15-year-old deaf children in the US was at the 7th-grade level, while age-matched hearing children performed at a 10th-grade level (Mitchell, 2009). These statistics indicate that deafness alone does not prevent a child from learning and manipulating the abstract symbols and symbolic relations of mathematics (Mitchell, 2009); although deaf children scored poorly in comparison to hearing children, the gap in this measure is a matter of three grade levels, so progress is clearly being made. However, as children move through school, the gap in academic achievement between deaf and hearing children is expected to widen: the median reading achievement of 17-21-year-old deaf students, for example, is at the 4th-grade level (Mitchell, 2009). Deaf children have particular difficulty with visual representations of speech (i.e. written and read language).
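The reported gaps can be collected in one place. The figures below simply restate the statistics from Mitchell (2009) quoted above; the variable names are illustrative, and the three-grade-level gap in math computation follows directly:

```python
# Grade-equivalent deficits reported for deaf students (Mitchell, 2009).
gaps_below_grade_level = {
    "reading (high school)": 7.4,
    "mathematics (high school)": 5.4,
}

# Median math computation at age 15: deaf students at the 7th-grade
# level, hearing students at the 10th-grade level.
deaf_math_grade, hearing_math_grade = 7, 10
math_computation_gap = hearing_math_grade - deaf_math_grade
print(math_computation_gap)  # 3 grade levels
```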
Other factors, including social class, ethnic background, and other handicapping conditions, were also found to impact the academic achievement of deaf students in much the same way as that of hearing students (Mayberry, 2002). Deafness interacts with many other factors to determine academic success or failure; it does not, by itself, impair academic achievement and, by extension, cognitive development.

IQ Test Performance

Performance on an IQ test is another indicator of cognitive development. If the deaf population had a lower IQ than the hearing population, this would suggest that deafness limits cognitive development. The Office of Demographic Studies at Gallaudet University tested 41,000 deaf students enrolled in special education in the United States to determine a mean nonverbal IQ. The mean IQ of the general hearing population is 100, and the mean nonverbal IQ of deaf children with no additional handicap was also 100 (Mayberry, 2002). This equality suggests that deafness does not impact IQ. In fact, over a span of 88 years, mean nonverbal IQ performance averaged 97.4 and appeared to increase over time (Mayberry, 2002). One explanation for the increase is the changing method of test administration, which more recently combines speech and signs. Although the mean nonverbal IQ of the deaf population matches that of the hearing population, some research suggests that the two populations differ in their strengths and weaknesses (Mayberry, 2002). This is understandable, given that deaf participants rely more heavily on visual acuity, while hearing participants have equal access to both audible and visual input.

Visual-Spatial Skills

Though deaf children perform adequately on language tasks in comparison to hearing children, they should perform better than hearing children on visual tasks as a result of their increased visual acuity. On this understanding, the visual-spatial skills of deaf and hearing individuals should be greatly improved by the use of sign language. One visual-spatial task that bore this out is the Benton Test of Facial Recognition, in which participants are shown a photograph of a face and asked to match it to one of six simultaneously presented faces (Mayberry, 2002). All sign language users, whether deaf or hearing, performed more accurately, and the advantage became more apparent when the faces were shadowed and harder to perceive. The benefit extended even to users who had learned sign language only a couple of years before taking the test: these participants needed less time and fewer trials to complete the face-matching tasks, and they identified facial expressions more accurately than hearing non-signers (Mayberry, 2002). Individuals who use sign language must communicate in a visual frame, so they become better at picking up visual cues, such as those present in facial expressions. These results make clear that learning and using sign language eventually sharpens visual-spatial abilities. Practice using a visual frame in communication makes it easier to complete visual tasks, such as recognizing movement patterns and generating and rotating mental images; both deaf and hearing signers were found to generate and rotate visual images more quickly than hearing non-signers (Mayberry, 2002). Deaf individuals without substantial knowledge of sign language, on the other hand, do not show these effects.

Conclusion

Deaf children who cannot hear conversational speech (approximately 60 dB) have difficulty mimicking, and thereby acquiring, spoken language.
Introducing sign language to a deaf child can greatly improve language acquisition by providing a visual method of communication. However, studies of signed languages have found that, despite the similarities in language acquisition, significant reorganization and hypertrophy occur in Broca's area and other areas of the brain associated with language. Comparing the development of deaf children with that of hearing children also showed that deaf children are most significantly affected in academic achievement and visual-spatial skills. While children with hearing loss have difficulty in the reading and mathematical areas of academic achievement, their visual acuity is increased as a result of sign language; indeed, all sign language users, whether deaf or hearing, benefit from enhanced visual-spatial skills. The obstacles that deaf children face on the way to academic success and language development may be unique, but given that they display similar IQ performance, there is clear potential for progress. The ability to focus on a visual frame of communication also suggests that the human brain is remarkably flexible and can adapt to the loss of one sensory input by allocating more resources to the inputs that remain (i.e. visual sources). Further research may lead to new approaches to the cognitive development of hearing-impaired children that improve their adult lives.
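For reference, the hearing-loss categories used throughout this report can be restated as a short sketch. The thresholds come from the definitions in the Introduction (severe: 70-89 dB loss; profound: 90 dB or more; conversational speech at roughly 60 dB); the function names and the label for losses below the severe range are illustrative choices, not standard audiological terms:

```python
CONVERSATIONAL_SPEECH_DB = 60  # approximate level of conversational speech

def classify_hearing_loss(loss_db):
    """Map a hearing loss in dB to the categories used in this report."""
    if loss_db >= 90:
        return "profound"
    if loss_db >= 70:
        return "severe"
    return "less than severe"  # finer categories are beyond this report's scope

def hears_conversation_unaided(loss_db):
    # A loss at or above the level of conversational speech means that
    # such speech is inaudible without amplification.
    return loss_db < CONVERSATIONAL_SPEECH_DB

print(classify_hearing_loss(95), hears_conversation_unaided(95))
```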

References

Kean M. 2008. The brain [Internet]. Office of Research [cited 2014 Feb 25]. Available from: http://wwwrohan.sdsu.edu/~gawron/intro/course_core/lectures/aphasia_cases_slides.html (*source of Figure 3)

Leporé N, Vachon P, Leporé F, Chou Y, Voss P, Brun C, Lee A, Toga A, Thompson P. 2010. 3D mapping of brain differences in native signing congenitally and prelingually deaf subjects. Human Brain Mapping [Internet]. [cited 2014 Mar 25] 31:970-978. Available from: http://21cif.com/tools/citation/cse/citewizard_cse_1.0.html (*source of Figures 4, 5)

Mayberry RI. 2002. Cognitive development in deaf children: the interface of language and perception in neuropsychology. Handbook of Neuropsychology (2nd ed.) [Internet]. [cited 2014 Feb 19] 8. Available from: http://www.academia.edu/4946633/cognitive_development_in_deaf_children_the_interface_of_language_and_perception_in_neuropsychology

Neville HJ, Mills DL. 1997. Epigenesis of language. Mental Retardation and Developmental Disabilities Research Reviews [Internet]. [cited 2014 Mar 25] 3:282-292. Available from: http://www.researchgate.net/publication/228416919_epigenesis_of_language (*source of Figures 1, 2)

Nishitani N, Amunts K, Hari R. 2005. Broca's region: from action to language. Physiology [Internet]. [cited 2014 Mar 25] 20:60-69. Available from: http://physiologyonline.physiology.org/content/20/1/60.full

Petitto LA, Marentette PF. 1991. Babbling in the manual mode: evidence for the ontogeny of language. Science [Internet]. [cited 2014 Mar 28] 251:1483-1496. Available from: http://www.academia.edu/332984/petitto_l.a._and_marentette_p._1991_._babbling_in_the_manual_mode_evidence_for_the_otogeny_of_language_reprinted_from_science_vol._252_pages_1483-1496

Vishton P. 2013. Effects that language has on cognitive development [Internet]. Demand Media, Inc. [cited 2014 Feb 25]. Available from: http://www.livestrong.com/article/93181-effects-language-cognitive-development/

*Figures extracted from the indicated references.