Studying the time course of sensory substitution mechanisms (CSAIL, 2014)


Studying the time course of sensory substitution mechanisms (CSAIL, 2014) Christian Graulty, Orestis Papaioannou, Phoebe Bauer, Michael Pitts & Enriqueta Canseco-Gonzalez, Reed College. Funded by the Murdoch Charitable Program for Life Sciences and the Reed College Science Research Fellowship

Brain plasticity. Changes in the brain due to changes in behavior, learning, environment, brain injury, or sensory deprivation. Neville and others examined how blind individuals use occipital cortex to process tactile information and how deaf individuals use auditory cortex to process visual information. Paul Bach-y-Rita (as early as 1969) showed that congenitally blind individuals could recruit tactile sensory systems as a substitute for vision (i.e., sensory substitution).

Neuroimaging studies. Karns et al. (2012) measured the fMRI signal in deaf and hearing individuals during a multisensory (visual and tactile) processing task. Heschl's gyrus (primary auditory cortex) and the superior temporal gyrus (a secondary auditory area) were recruited in deaf individuals to process non-auditory and cross-modal stimuli. Kim & Zatorre (2006, 2011, 2012) trained intact individuals to use the vOICe, a sensory substitution device, to access shape information via sound. They found increased post-training connectivity between primary auditory cortex and the lateral occipital complex (LOC), an area normally activated by shape processing (visual or tactile).

Our study. Simple strategy: compare the brain electrical activity elicited by auditory and visual stimuli before and after intact individuals are trained to extract visual shape information from auditory stimuli.

Methods. Thirty-one participants (23 female, ages 19-24) were randomly assigned to the Meijer Algorithm group (N=16) or the Control group (N=15). Each subject participated in three 2-hour sessions on three consecutive days.

Overview of the three experimental sessions: Pre-training Phase, Training Phase, Post-training Phase.

Sample visual stimuli: 180 complex shapes drawn using an alphabet of 13 basic shapes.

Meijer Image-to-Sound Conversion Algorithm. Auditory stimuli were generated from the shapes using the Meijer image-to-sound algorithm: 1. The vertical dimension of the image is coded into frequencies between 500 Hz and 5000 Hz, with higher spatial position corresponding to higher pitch. 2. The horizontal dimension is coded into a 500 ms left-to-right panning of the sound.
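As a rough illustration of this style of mapping (a sketch, not the authors' implementation), the snippet below converts a binary shape image into a 500 ms sound: rows become frequencies between 500 Hz and 5000 Hz (higher rows = higher pitch) and columns are played left to right. The frequency spacing, sampling rate, and amplitude handling are assumptions.

```python
import numpy as np

def image_to_sound(image, duration=0.5, fs=44100, f_lo=500.0, f_hi=5000.0):
    """Minimal Meijer-style image-to-sound sketch.

    image: 2-D binary array (rows x cols), row 0 = top of the image.
    Returns a mono waveform of duration*fs samples.
    """
    n_rows, n_cols = image.shape
    # Higher spatial position -> higher pitch, so the top row maps to f_hi.
    freqs = np.linspace(f_hi, f_lo, n_rows)
    n_samples = int(duration * fs)
    t = np.arange(n_samples) / fs
    samples_per_col = n_samples // n_cols
    sound = np.zeros(n_samples)
    for col in range(n_cols):                      # left-to-right scan over `duration`
        start, stop = col * samples_per_col, (col + 1) * samples_per_col
        for row in np.flatnonzero(image[:, col]):  # active pixels in this column
            sound[start:stop] += np.sin(2 * np.pi * freqs[row] * t[start:stop])
    peak = np.abs(sound).max()
    return sound / peak if peak > 0 else sound     # normalize to avoid clipping
```

For instance, `image_to_sound(np.eye(8))` yields a 500 ms sweep that descends in pitch from left to right, since the diagonal runs from the top-left to the bottom-right of the image.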

Examples of the image-to-sound conversion

Both groups went through the same three types of training and the same transfer test (explained below). Difference between groups: participants in the Control group were presented with the same sounds and images as the Meijer group, but the sound-image pairings were random.

Training protocol:
1. Passive training: passive exposure to 80 simultaneous image-sound pairs, presented twice each, for a total of 15 min.
2. 5-alternative forced-choice task (with feedback): 240 randomly selected trials, around 30-50 min.
3. Matching task (with feedback): 80 familiar pairs plus 20 novel pairs (participants were instructed to use the rules or to guess, according to their group).
4. Transfer test (without feedback): similar to the matching task but with no repetition of sounds; 80 familiar and 80 novel sound-image pairs.

5-choice matching task

Matching task

Transfer test

Behavioral Results (effects of training)

Transfer Test Accuracy [bar graph: % Correct (20-90) for Familiar and Novel stimuli in the Control and Meijer groups; chance level marked; performance significantly above chance; P < .0001]
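The slides do not specify the statistical tests behind these annotations; a generic sketch of how per-subject accuracies could be compared against chance and between groups (the chance level and the data are left to the caller) might look like this:

```python
import numpy as np
from scipy import stats

def above_chance(accuracies, chance):
    """One-sample t-test of per-subject % correct scores against a chance level."""
    return stats.ttest_1samp(np.asarray(accuracies, dtype=float), popmean=chance)

def group_difference(meijer_acc, control_acc):
    """Independent-samples t-test comparing the Meijer and Control groups."""
    return stats.ttest_ind(meijer_acc, control_acc)
```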

Overview of the three experimental sessions: Pre-training Phase, Training Phase, Post-training Phase.

Triad Task (while recording EEG). Pre-training: matching within modality only. Post-training: identical stimuli, but now matching within or across modality.

Triad Test Accuracy (cross-modal trials only) [bar graph: % Correct (20-80) for Pre-training and Post-training in the Control and Meijer groups; chance level marked; performance significantly above chance; P < .001]

Summary of behavioral findings. Both groups learned the trained sound-image pairings; however, the Meijer group outperformed the Control group. The Meijer group was able to transfer their performance to new sound-image pairs. Post-training, in the triad task, both groups identified sound-image pairs cross-modally, and the Meijer group clearly outperformed the Control group. What about their pre- vs. post-training ERPs in the triad task?

Triad task reminder: in the 1st and 3rd sessions we recorded EEG in response to identical auditory and visual stimuli.

EEG recording: 96 equidistant electrodes; average mastoid reference; 500 Hz sampling rate, 30 Hz low-pass filter. ERPs were time-locked to the onset of the 1st stimulus (the 1st sound) and the 2nd stimulus (the 1st image).
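A minimal sketch of how such a recording could be turned into ERPs with MNE-Python, assuming a BrainVision file, mastoid channel labels M1/M2, and trigger codes for the first sound and first image (the file name, channel labels, and event codes are illustrative, not from the slides):

```python
import mne

raw = mne.io.read_raw_brainvision("subject01.vhdr", preload=True)  # hypothetical file
raw.set_eeg_reference(["M1", "M2"])       # average-mastoid reference (assumed labels)
raw.filter(l_freq=None, h_freq=30.0)      # 30 Hz low-pass, as in the recording setup

events, _ = mne.events_from_annotations(raw)
event_id = {"sound1": 1, "image1": 2}     # assumed trigger codes
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.1, tmax=0.6,  # -100 to 600 ms epochs
                    baseline=(None, 0), preload=True)
erp_sound = epochs["sound1"].average()    # ERP time-locked to the 1st sound
erp_image = epochs["image1"].average()    # ERP time-locked to the 1st image
```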

ERP Results - Auditory (electrode AFz)

[ERP waveforms at AFz, -100 to 600 ms, ±4 µV, for the Control and Meijer groups, with Pd1, Pd2, and Pd3 components marked; difference waves (post minus pre training) shown below.]

Difference Maps (Post minus Pre Training) [scalp topographies for the Control and Meijer groups in three time windows: Pd1: 170-210 ms, Pd2: 270-310 ms, Pd3: 400-500 ms; difference waves (post minus pre training) at AFz, -100 to 600 ms, ±4 µV.]
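Continuing the earlier pipeline sketch, the post-minus-pre difference waves and the mean amplitudes in these windows could be obtained roughly as follows, assuming `erp_pre` and `erp_post` are MNE Evoked objects for the auditory stimuli from the pre- and post-training sessions:

```python
import mne

diff = mne.combine_evoked([erp_post, erp_pre], weights=[1, -1])  # post minus pre

windows = {"Pd1": (0.170, 0.210), "Pd2": (0.270, 0.310), "Pd3": (0.400, 0.500)}
for name, (tmin, tmax) in windows.items():
    # Mean amplitude at AFz within each window (channel label assumed to exist).
    amp = diff.copy().pick(["AFz"]).crop(tmin, tmax).data.mean()
    print(f"{name}: {amp * 1e6:.2f} µV (post minus pre)")
```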

Auditory Results Summary. Comparing ERPs elicited by sounds before and after training, we found Pd1 and Pd3 changes that were present in both groups; however, they were larger in the Meijer group. A Pd2 effect (270-310 ms) was observed only in the Meijer group.

ERP Results - Visual (electrode PO8)

[ERP waveforms at PO8, -100 to 600 ms, ±4 µV, for the Control and Meijer groups, with the N1 component marked; difference waves (post minus pre training): significant for both groups, p < .05, and marginally larger for the Meijer group, p = .06.]

Difference Maps (Post minus Pre Training) [scalp topographies for the N1 window, 110-180 ms; difference waves (post minus pre training) at PO8 for the Control and Meijer groups, -100 to 600 ms, ±4 µV.]

Visual Results Summary. Comparing ERPs elicited by visual images before and after training showed a larger N1 (110-180 ms) in both groups, but this increase was larger for the Meijer group.

Conclusions. Sensory substitution training enhances the changes in electrophysiological activity that come about through cross-modal learning. A possible neural correlate of (visual-auditory) sensory substitution is the auditory Pd2 (270-310 ms).

How long does Meijer training last? A year later we brought back 5 participants from the Control (Association) group and 6 participants from the Meijer group.

Auditory Data [ERP waveforms at AFz, -100 to 400 ms, ±4 µV, for the Control and Meijer groups; overlaid traces: Pre-Training, Post-Training, One Year Later.]

Visual Data [ERP waveforms at PO8, -100 to 500 ms, ±4 µV, for the Control and Meijer groups; overlaid traces: Pre-Training, Post-Training, One Year Later.]

New design, ongoing study (preliminary data). Equate pre- vs. post-training tasks (uni-modal matches/mismatches). Include novel stimuli in the pre- and post-training sessions (to control for exposure). Replicate our findings.

Auditory Data 2013 vs. 2014 [ERP waveforms at AFz for the Control and Meijer groups; 2013 data: -100 to 600 ms, ±4 µV, with Pd1, Pd2, and Pd3 marked; 2014 preliminary data: -100 to 600 ms, ±5 µV.]

Visual Data 2013 vs. 2014 [ERP waveforms at PO8 for the Control and Meijer groups; 2013 data: -100 to 600 ms, ±4 µV, with N1 marked; 2014 preliminary data: -100 to 600 ms, ±5 µV.]

Plans and outstanding questions. Investigate timing differences (via mass univariate analysis). Source analysis. How similar is this process to that observed in blind individuals?

THANK YOU