
Title of Thesis: Study on Audiovisual Integration in Young and Elderly Adults by Event-Related Potential

September 2014

Yang Weiping

The Graduate School of Natural Science and Technology (Doctor's Course)
OKAYAMA UNIVERSITY


ABSTRACT

In daily life, most external information is received through vision and hearing. Visual and auditory signals are received separately and integrated in the human brain, providing a comprehensive understanding of the real world. It is therefore important to study integration across sensory modalities; however, the neural mechanism of audiovisual integration is not completely clear. In this study, electroencephalography (EEG), which offers high temporal resolution, was used to examine audiovisual integration. We first investigated the effect of the spatial location of auditory stimuli on human audiovisual integration. Our results showed that audiovisual integration was elicited even when auditory stimuli were presented behind the participant, but that no integration occurred when auditory stimuli were presented in the right or left spaces, suggesting that the human brain might be more sensitive to information received from behind than from either side. Another important aspect is that aging is gradually becoming a global issue, and the effects of aging on audiovisual integration are of growing concern. We investigated these effects using behavioral analysis based on the race model together with EEG. The race model results showed that the time window of audiovisual behavioral facilitation in elderly subjects was longer but more delayed than in young subjects. The event-related potential (ERP) results likewise showed that audiovisual integration occurs later in elderly adults than in young adults. Building on these findings, future studies will focus on special populations such as older people and patients with headache or schizophrenia. I hope to clarify the neural mechanism of audiovisual integration and to provide an important basis for the early clinical detection and rehabilitation of brain disease.

CONTENTS

Abstract
Contents

Chapter 1 Introduction
    1.1 Audiovisual integration in the brain
        Audiovisual integration
        Related studies with audiovisual integration
    1.2 Race model
    1.3 Event-related potentials (ERPs)
        What is an electroencephalogram (EEG)?
        Event-related potentials (ERPs)
        Analysis method of ERP data in audiovisual integration studies
    1.4 The purpose of the present dissertation
    1.5 The contents of the dissertation

Chapter 2 Effects of auditory stimuli in the horizontal plane on audiovisual integration: an event-related potential study
    2.1 Background
    2.2 Methods
        Participants
        Stimuli and task
        Apparatus
        Data analysis
    2.3 Results
        Behavioral results
        ERP results
    2.4 Discussion
        Integration of auditory stimuli from the front
        Integration of auditory stimuli from the back
        Integration of auditory stimuli from the left and right spaces
    2.5 Conclusions

Chapter 3 Age-related audiovisual integration elicited by peripherally presented audiovisual stimuli: a behavioral study
    3.1 Introduction
    3.2 Methods
        Subjects
        Experimental design
        Data analysis
    3.3 Results
        Hit rates
        Response times
        Race model comparisons
    3.4 Discussion
    3.5 Conclusion

Chapter 4 Audiovisual integration elicited by peripherally presented stimuli in young and elderly adults: an event-related potential study
    4.1 Introduction
    4.2 Methods
        Subjects
        Stimuli
        Task description
        Apparatus
        Data analysis
    4.3 Results
        Behavioral results
        Event-related potential (ERP) results
    4.4 Discussion
        Audiovisual integration in the young adults
        Audiovisual integration in the elderly adults
    4.5 Conclusion

Chapter 5 General Conclusion and Future Challenges
    General conclusions
    Future challenges

Appendix
    Ⅰ. Simple introduction of the EEG apparatus
    Ⅱ. Individual behavioral results
    Ⅲ. Analysis of behavioral data

Publications
    Journal papers
    International conference papers
    Book chapters

Acknowledgements

References

Chapter 1 Introduction

Summary

This chapter introduces the concept of audiovisual integration in the brain, related previous studies of audiovisual integration, the race model, which uses reaction times to evaluate audiovisual integration, the electroencephalogram (EEG), and the method of event-related potential (ERP) analysis used in audiovisual integration studies. The aim and contents of the thesis are also briefly described.

1.1 Audiovisual integration in the brain

Audiovisual integration

Humans, like most other organisms, are equipped with multiple sensory channels through which to experience the environment. The information received from the different modalities must be localized and integrated by several systems to produce coherent cognition in the brain. In real life, about 80% of the information received by the human brain comes from the auditory and visual senses. Through integration, the brain obtains a comprehensive recognition of the real world. For example, when you are talking with a friend, you usually hear your friend speaking while you see his or her mouth move. This process is referred to as audiovisual integration.

Figure 1.1 The influence of visual processing on the perception of speech sounds, commonly known as the McGurk effect. When the speaker mouths the syllable /ga/ but the auditory stimulus is actually /ba/, subjects tend to hear /da/.

A typical example of audiovisual interaction is the McGurk effect, first described in a paper by McGurk and MacDonald in 1976 [1]. When a video of one phoneme being produced is dubbed onto a sound recording of a different spoken phoneme, the perceived phoneme is a third, intermediate phoneme. For example, a visual /ga/ combined with an audio /ba/ is often heard as /da/ (Figure 1.1). The McGurk effect demonstrates an interaction between hearing and vision in speech perception.

Related studies with audiovisual integration

Many previous studies have investigated multisensory audiovisual integration using behavioral and event-related potential measures in humans [2,3,4,5,6,7]. Behavioral results have shown that responses to audiovisual stimuli are faster and more accurate than responses to either a unimodal visual or a unimodal auditory stimulus. Previous neurological research showed that integration occurring up to 200 ms after stimulus onset is mediated by early sensory-perceptual processing, whereas responses after 200 ms are mediated by late cognitive processing [8,9]. Some researchers have shown that the facilitative effect of bimodal stimuli on signal detectability depends on the spatial location of the stimuli: modalities presented at the same location produce the greatest response enhancement, whereas spatially disparate modalities induce a weaker effect or no change in response [10]. Moreover, previous studies based on visual discrimination tasks have also confirmed that the perception of visual stimuli can be improved by simultaneous auditory stimuli [11,12].

In these previous studies, audiovisual integration was studied using auditory and visual stimuli at various locations; however, these modalities were presented either directly in front of the participants or to their left or right. Some recent studies have investigated audiovisual speech perception in which the range of locations for auditory and visual stimuli was extended to include positions behind the participants (180° from the point of fixation), i.e., the entire horizontal plane [13]. However, whether simple non-speech signals originating from behind participants can

enhance the detection of visual stimuli remains unclear.

The effects of aging on audiovisual integration are of growing concern. Some studies on the effects of aging have shown response-time facilitation for bimodal audiovisual stimuli compared with unimodal stimuli in both old and young adults [14,15]. Their results showed that audiovisual integration in elderly individuals is greater than that in young individuals. On the contrary, other work found slower reaction times and reduced reaction-time facilitation effects in the elderly, together with suppressed cortical responses [16]. The previous studies therefore provide conflicting views, and further studies are required to confirm these details and elucidate the underlying brain mechanism.

Recently, numerous fMRI and EEG/MEG studies have shown that multisensory interplay can affect not only established multisensory convergence zones but also brain areas and responses traditionally considered sensory-specific [17,18,19,20,21,22]. However, despite the investigation of audiovisual integration in basic neuroscience, the exact neural mechanisms are not completely understood.

1.2 Race model

To control for the redundant nature of the multisensory condition, response times are analyzed using cumulative distribution functions (CDFs), and the multisensory data are compared with the statistical facilitation predicted by a CDF of the summed probability of the visual and auditory responses. This model is often referred to as the independent race model [23,24]. To complete this analysis, the CDFs for each trial type were generated for each participant using 10 ms time bins. Each participant's unimodal CDFs were used to calculate the race distribution at each time bin using the following formula:

[P(A) + P(V)] − [P(A) × P(V)]

(A, auditory; V, visual; AV, audiovisual). If the probability of the response to the bimodal audiovisual stimulus was significantly greater than that predicted by the summed probability of the unimodal stimuli, audiovisual integration was considered to have occurred.

1.3 Event-related potentials (ERPs)

What is an electroencephalogram (EEG)?

Electroencephalography (EEG) is the standard noninvasive method for measuring electrical potential changes arising from the brain. The EEG signal is recorded on the scalp surface and reflects the summed postsynaptic activity in the underlying cortical regions. A key advantage of the method is its high temporal resolution. Figure 1.2 shows recorded EEG data. EEG recordings show the overall activity of millions of neurons in the brain. The recordings show fluctuations over time that are often rhythmic, in the sense that they alternate regularly. The EEG patterns change when external stimuli (such as sounds or pictures) are presented.

Figure 1.2 Recorded EEG data
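The race-model comparison described in Section 1.2 can be sketched in Python. This is a minimal illustration, not the thesis's actual analysis code: reaction times are assumed to be per-participant arrays in milliseconds, and all names (`race_model_violation`, `bin_ms`, and so on) are illustrative.

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av, t_max=1000, bin_ms=10):
    """Compare the observed audiovisual CDF against the race-model bound.

    rt_a, rt_v, rt_av: response times (ms) of one participant for the
    auditory, visual, and audiovisual conditions. Returns the time bins
    and the difference CDF(AV) - race bound at each bin; positive values
    indicate a race-model violation, i.e., audiovisual integration.
    """
    bins = np.arange(0, t_max + bin_ms, bin_ms)

    def cdf(rts):
        rts = np.asarray(rts)
        # P(RT <= t), estimated empirically at each time bin
        return np.array([np.mean(rts <= t) for t in bins])

    p_a, p_v, p_av = cdf(rt_a), cdf(rt_v), cdf(rt_av)
    # Independent race model bound: P(A) + P(V) - P(A) * P(V)
    race = p_a + p_v - p_a * p_v
    return bins, p_av - race
```

Bins where the returned difference is reliably positive across participants mark the time window in which responses are faster than the fastest unimodal process can explain, which is the behavioral criterion for integration used with the race model.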

Event-related potentials (ERPs)

The event-related potential (ERP) technique, in which neural responses to specific events are extracted from the EEG, provides a powerful noninvasive tool for exploring the human brain, and scientists use it to investigate cognitive processing. Event-related potentials are very small voltages generated in brain structures in response to specific events or stimuli, and ERP components are useful as measures of covert information processing. An averaging technique is applied to the EEG signals to extract the ERP components. Figure 1.3 shows a schematic of the ERP averaging principle.

Figure 1.3 Schematic of the ERP data averaging principle

Analysis method of ERP data in audiovisual integration studies

In some event-related potential (ERP) studies, audiovisual (AV) integration was investigated by comparing the ERP elicited by a bimodal AV stimulus with the sum of the ERPs elicited by the constituent auditory (A) and visual (V) stimuli [25,26,27,28]. In my studies, the formula

ERP(AV) − [ERP(A) + ERP(V)] = ERP(A × V interactions)

was used to estimate cross-modal interaction, or integration, as the difference between the brain responses to bimodal stimuli and the algebraic sum of the unimodal responses. Moreover, the [AV − (A + V)] complex was used to determine cortical regions that were uniquely activated by bimodal stimulation. The effects of AV integration were expected to be observed as differences between the AV and (A + V) waveforms, [AV − (A + V)].

Figure 1.4 Analysis method of ERP data in the audiovisual integration study

1.4 The purpose of the present dissertation

The main aim of this thesis research was to investigate the brain activity underlying audiovisual integration using behavioral measures and electroencephalography (EEG), which offers high temporal resolution, and to elucidate the mechanism of audiovisual integration in humans, including healthy young and elderly adults.

1.5 The contents of the dissertation

This dissertation mainly investigates the brain mechanisms of audiovisual integration in healthy young and elderly adults.

Chapter 1 introduces the concept of audiovisual integration in the brain, related previous studies, the race model, electroencephalogram (EEG) analysis, and the method

of event-related potential (ERP) analysis in audiovisual integration studies. The aim and contents of the thesis are also briefly described.

Chapter 2 describes the first experiment. Spatial information, i.e., the fact that visual and auditory stimuli can come from different locations, is an important factor in audiovisual integration studies. This study investigates the effects of auditory stimuli in the horizontal plane on audiovisual integration using high-density ERP recording and analyzes the neural bases of audiovisual integration in more detail.

Chapter 3 describes the second experiment. The effects of aging on audiovisual integration are of growing concern. A visual and auditory discrimination task was designed that included visual, auditory, and audiovisual stimuli. Audiovisual integration was compared between young and elderly adults using the race model.

Chapter 4 describes the third experiment. In this experiment, audiovisual integration elicited by peripherally presented stimuli in a divided attention task was investigated using behavioral and electrophysiological measurements in young and elderly adults.

Chapter 5 provides a general conclusion based on the findings of these experiments and discusses future challenges.

Chapter 2 Effects of auditory stimuli in the horizontal plane on audiovisual integration: an event-related potential study

Summary

This chapter aims to investigate whether auditory stimuli in the horizontal plane, particularly those originating from behind the participant, affect audiovisual integration, using behavioral and event-related potential (ERP) measurements. In this study, visual stimuli, auditory stimuli, and audiovisual stimuli (comprising a visual stimulus together with an auditory stimulus originating from one of four locations: front (0°, the fixation point), right (90°), back (180°), or left (270°)) were presented. These stimuli were presented randomly with equal probability; participants were asked to attend to the visual stimuli and respond promptly only to visual target stimuli (a unimodal visual target stimulus or the visual target component of an audiovisual stimulus). A significant facilitation of reaction times and hit rates was obtained following audiovisual stimulation, irrespective of whether the auditory stimuli were presented in front of or behind the participant. However, no significant interactions were found between visual stimuli and auditory stimuli from the right or left. Two main ERP components related to audiovisual integration were found: first, auditory stimuli from the front location produced an ERP reaction over the right temporal area and right occipital area at approximately milliseconds; second, auditory stimuli from the back produced a reaction over the parietal and occipital areas at approximately milliseconds. Our results confirmed that audiovisual integration was elicited even when auditory stimuli were presented behind the participant, but no integration occurred when auditory stimuli were presented in the right or left spaces, suggesting that the human brain might be more sensitive to information received from behind than from either side.

Keywords: Audiovisual Integration, Auditory Stimuli, Horizontal Plane

2.1 Background

In a social environment, many objects and events are often perceived simultaneously via different sensory systems. The information received from the different modalities must be localized and integrated by several systems to produce coherent cognition in the brain. Previous studies have shown that bimodal audiovisual stimuli can be discriminated or detected more accurately and faster than unimodal auditory or visual stimuli presented alone [4,23,29,30]. This facilitative effect is called audiovisual integration.

In previous studies, discrimination tasks were used to investigate audiovisual integration using target and standard stimuli, and subjects were required to respond only to target stimuli. Specifically, discrimination tasks involve visual (or auditory) detection methods, in which subjects are instructed to respond to visual (or auditory) target stimuli while not responding to standard stimuli [31], or visual and auditory detection methods, in which subjects are required to respond to both visual and auditory target stimuli [2,3]. In behavioral studies, using either visual and auditory detection [24,32] or one-modality (i.e., visual) detection [33], some researchers have shown that the facilitative effect of bimodal stimuli on signal detectability depends on the spatial location of the stimuli: modalities presented at the same location produce the greatest response enhancement, whereas spatially disparate modalities induce a weaker effect or no change in response [10]. Moreover, previous studies based on visual discrimination tasks have also confirmed that the perception of visual stimuli

can be improved by simultaneous auditory stimuli occurring at a location spatially close to the visual stimuli [11,34]. Recently, some researchers investigated audiovisual integration elicited by a pair of visual and auditory stimuli presented in the frontal horizontal plane (0° from the fixation point) and at right locations (up to 90° from the fixation point) [35]. Their findings indicated that the facilitative effects in localizing paired visual and auditory stimuli decrease from central to peripheral locations and that the enhancement was no longer observed at the 90° right location. However, Stein, London, Wilkinson, and Price (1996) used a visual discrimination task to determine that the perception of peripheral visual stimuli (30° to the left or right of fixation) was unaffected by an auditory stimulus presented at the same location as the visual stimuli or at 45° to their right or left. When the visual stimuli were presented at the central fixation point, perception and detection of the visual stimuli were enhanced by simultaneous auditory stimuli, regardless of whether the auditory stimulus was spatially congruent or displaced 45° to the right or left of fixation [36]. Therefore, the positional relationship of the modalities was not a decisive factor in determining whether multisensory interactions occurred.

In some event-related potential (ERP) studies, audiovisual (AV) integration was investigated by comparing the ERP elicited by a bimodal AV stimulus with the sum of the ERPs elicited by the constituent auditory (A) and visual (V) stimuli [25,26,27,28]. In these studies, the formula ERP(AV) − [ERP(A) + ERP(V)] = ERP(A × V interactions) was used to estimate cross-modal interaction, or integration, as the difference between the brain responses to bimodal stimuli and the algebraic sum of the unimodal responses.
Moreover, the [AV − (A + V)] complex was used to determine cortical regions that were uniquely activated by bimodal stimulation. The effects of AV integration were expected to be observed as differences between the AV and the (A +

V) waveforms, [AV − (A + V)]. Based on this method, Giard and Peronnet (1999) reported that integrative processes can occur at latencies as early as 40 milliseconds (ms) after stimulus onset over the posterior scalp areas when visual and auditory stimuli are simultaneously presented at the central location in a discrimination task. Furthermore, audiovisual integration of spatial pairs of visual and auditory stimuli has been investigated using visual and auditory discrimination tasks with ERP recordings [5,37]. In a study by Teder-Sälejärvi et al. (2005), a pair of auditory and visual stimuli was presented either at the same spatial location (i.e., 30° to the left or right of the central fixation point) or at opposite spatial locations (i.e., V left and A right). Their findings showed that responses were faster for bimodal stimuli than for unimodal stimuli, regardless of whether the stimuli were at the same or opposite spatial locations. However, Gondan et al. (2005) found that differential integration between the same and opposite spatial locations was observed as early as 160 ms after stimulus onset over the parietal cortex using ERPs. Therefore, studies using high-density ERP recordings have revealed important insights about when (i.e., latency) and where (i.e., topography) audiovisual integration occurs in the human brain and indicated that the effect of integration occurs earlier for central than for peripheral locations. These electrophysiological findings also confirmed that one modality can be affected by another, even when the modalities originate from incongruent spatial locations.

In these previous studies, audiovisual integration was studied using auditory and visual stimuli at various locations; however, these modalities were presented either directly in front of the participants or to their left or right.
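The [AV − (A + V)] subtraction used in these ERP studies can be sketched numerically as follows. This is a minimal illustration, assuming averaged ERPs stored as channels × samples NumPy arrays; the function names are illustrative and not taken from the thesis.

```python
import numpy as np

def av_difference_wave(erp_av, erp_a, erp_v):
    """Additive-model difference wave: AV - (A + V).

    Each argument is a (channels x samples) array holding the averaged
    ERP for one condition. Nonzero values in the result reflect activity
    specific to bimodal stimulation.
    """
    return np.asarray(erp_av) - (np.asarray(erp_a) + np.asarray(erp_v))

def windowed_means(diff_wave, sfreq=500, win_ms=20):
    """Mean amplitude of the difference wave in consecutive windows of
    win_ms milliseconds, given the sampling frequency sfreq in Hz."""
    step = int(sfreq * win_ms / 1000)          # samples per window
    n_win = diff_wave.shape[1] // step
    trimmed = diff_wave[:, :n_win * step]      # drop any partial window
    return trimmed.reshape(diff_wave.shape[0], n_win, step).mean(axis=2)
```

Under the additive model, statistics are then computed on these windowed mean amplitudes, comparing the AV response against the A + V sum at each electrode and time window.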
Some recent studies have investigated audiovisual speech perception in which the range of locations for

auditory and visual stimuli was extended to include positions behind the participants (180° from the point of fixation), i.e., the entire horizontal plane [13]. Their results showed that the facilitation effect is stronger when auditory stimuli are presented behind the participants. These results indicated that audiovisual speech perception is impervious to spatial discordance because of the specialized processing of more complex speech signals [38]. Moreover, Wyk et al. (2010) investigated the cortical integration of audiovisual speech and non-speech stimuli. Their findings demonstrated the differential activation of cortical networks involved in the integration of speech and non-speech stimuli [39]. However, whether simple non-speech signals originating from behind participants can enhance the detection of visual stimuli remains unclear. Furthermore, whether simple non-speech signals originating 90° to the left or right of participants, in the same horizontal plane as the signals from behind, can affect the detection of visual stimuli also remains unclear from electrophysiological evidence.

The present study investigates the effects of auditory stimuli in the horizontal plane on audiovisual integration using high-density ERP recording and analyzes the neural bases of audiovisual integration in more detail. We designed a simple visual discrimination task that included visual stimuli, auditory stimuli, and audiovisual stimuli, which were randomly presented with equal probability. The visual stimuli consisted of Gabor gratings presented directly in front of the participants. The auditory stimuli consisted of a 3000 Hz sinusoidal tone presented at one of four locations on an equidistant horizontal plane: the front (0°, the fixation point), right (90°), back (180°), or left (270°) of the participants.
By comparing the audiovisual integration elicited when the visual stimulus at the front location was paired with auditory stimuli from each of the four horizontal-plane locations, we examined whether audiovisual integration is modulated by the position of the

auditory stimuli in the horizontal plane, particularly those originating from behind the participants.

2.2 Methods

Participants

Fourteen healthy volunteers (ages – years; mean age, 22.5 years) participated in this study. All of the participants had normal or corrected-to-normal vision and normal hearing. The participants provided written informed consent for their participation in this study, which was approved in advance by the ethics committee of Okayama University.

Stimuli and task

The experiment was performed in a dimly lit, sound-attenuated, and electrically shielded 2.0 m × 2.8 m × 2.3 m room (Figure 2.1; laboratory room, Okayama University, Japan). Streams of unimodal visual, unimodal auditory, and bimodal audiovisual stimuli (auditory and visual components occurring simultaneously) were randomly presented. The unimodal visual stimuli consisted of two subtypes, standard and target stimuli. The visual standard stimulus was a Gabor patch with vertical gratings (2° diameter; spatial frequency = 1.6 cycles/degree). The visual target stimulus was a Gabor patch with horizontal gratings (2° diameter; spatial frequency = 1.6 cycles/degree). These visual stimuli were presented approximately 4° below the fixation point on a 21-inch computer monitor (background gray level 128) positioned 70 cm in front of the subject's eyes. The visual stimulus was titrated for each subject so that detection required highly focused attention but could be performed at approximately 80% accuracy. This adjustment was accomplished by changing the contrast, which ranged from 3% to 7% in units of Michelson contrast:

Figure 2.1 EEG room

Figure 2.2 System of the EEG device

(max − min) / (max + min), where max and min are the maximal and minimal luminance values of the Gabor patch.

The unimodal auditory stimulus was a 3000 Hz sinusoidal tone with a linear rise and fall time of 5 ms and an amplitude of 60 dB. The auditory stimulus originated from one of four loudspeakers that were hidden by a black curtain. The four loudspeakers were positioned 80 cm from the participant at the front (0°, the fixation point), right (90°), back (180°), and left (270°), at the level of the participant's ears. The bimodal audiovisual stimulus consisted of the simultaneous presentation of both a unimodal visual stimulus (standard or target) and a unimodal auditory stimulus (originating from one of the four locations). The target stimuli made up approximately 19% of the total stimuli. The duration of each type of stimulus was 40 ms. The inter-stimulus interval (ISI) varied randomly between 800 ms and 1200 ms (mean ISI = 1000 ms).

Eight experimental blocks were performed in this study, with each block lasting approximately 4 min. Each block consisted of 266 trials in total (i.e., visual (24 + 10), auditory (24 × 4), and audiovisual ((24 + 10) × 4)): the unimodal visual stimuli comprised 24 standard stimuli and 10 target stimuli; the unimodal auditory stimuli comprised 96 standard stimuli, with 24 presented at each location; and the bimodal audiovisual stimuli comprised 10 target stimuli and 24 standard stimuli for each of the four auditory locations (in total, 40 target stimuli and 96 standard stimuli across the four locations). These stimuli were randomly presented within each block. Each block began with a 3000 ms fixation period followed by the test stimuli. During the fixation period, a cross (+) was shown on the screen. During the response period that followed each test stimulus, the screen was blank. The experiment then continued with the next trial at the set ISI (800–1200 ms), regardless of whether the subject had responded to the target stimulus.

Throughout the experiment, the subjects were required to fix their eyes on a centrally presented fixation point on the screen and to attend to the visual stimuli, comprising the unimodal visual stimuli and the visual component of the audiovisual stimuli, while ignoring all auditory stimuli (Figure 2.3). The participants were instructed to respond to visual target stimuli by pressing the left button of a computer mouse with their right hand as quickly and accurately as possible.

Figure 2.3 Experimental paradigm and stimuli. (A) Subjects sat approximately 70 cm from the screen. Visual stimuli were presented on the front screen. Auditory stimuli were randomly presented at one of four locations using speakers (front 0°, right 90°, back 180°, and left 270°). (B) Stimulus types.

Apparatus

Stimulus presentation was controlled by a personal computer running Presentation software (Neurobehavioral Systems, Albany, CA). An electroencephalographic (EEG)

system (BrainAmp MR plus, Gilching, Germany) was used to record EEG signals through 32 electrodes mounted on an electrode cap (Easy-cap, Herrsching-Breitbrunn, Germany) (Figure 2.2). All signals were referenced to the left and right earlobes. Horizontal eye movements were measured by deriving the electrooculogram (EOG) from one electrode placed at the outer canthus of the left eye. Vertical eye movements and eye blinks were detected by deriving an EOG from an electrode placed approximately one centimeter below the subject's left eye. Impedance was maintained below 5 kΩ. Raw signals were digitized at a sampling frequency of 500 Hz with a 60 Hz notch filter, and all data were stored digitally for off-line analysis. The ERP analysis was carried out using Brain Vision Analyzer software (version 1.05, Brain Products GmbH, Munich, Bavaria, Germany).

Data Analysis

Behavioral Data

Hit rates and response times to target stimuli were computed separately for each stimulus type. Hit rates were the number of correct responses to target stimuli divided by the total number of target stimuli. Response-time data were analyzed for correct responses only. Incorrect responses to standard stimuli were summarized as false alarm rates. We then used z-transformed ratios to compute sensitivity (d′) and criterion (c) [40]. Behavioral results for all measures (response times, hit rates, false alarm rates, d′, and response criterion) were analyzed using repeated-measures analysis of variance with stimulus modality (visual stimuli and audiovisual stimuli) as a within-subject factor, and the alpha level was set at p <

ERP Analysis

The ERPs elicited by the standard stimuli were analyzed. The EEG and EOG signals were amplified and band-pass filtered with an analog filter of Hz at

a sampling rate of 500 Hz. The EEG and EOG signals were divided into epochs from 100 ms before stimulus onset to 600 ms after onset, and baseline corrections were made against the 100 ms pre-stimulus interval. Trials with vertical eye movements or eye blinks (vertical EOG amplitudes exceeding ±100 μV), horizontal eye movements (horizontal EOG amplitudes exceeding ±25 μV), or other artifacts (a voltage exceeding ±80 μV at any electrode relative to baseline) were excluded from the analysis; these trials were subjected to automatic rejection. Responses associated with false alarms were also excluded from the analysis. The data were then averaged for each stimulus type following digital filtering with a band-pass filter of Hz. Grand-averaged data were obtained across all participants for each stimulus type.

Audiovisual integration was assessed with the difference wave [AV − (A + V)], obtained by subtracting the sum of the responses to the unimodal stimuli from the responses to the bimodal stimuli [41,42]. The logic of this additive model is that the ERPs to bimodal (AV) stimuli equal the sum of the unimodal (A + V) responses plus any putative neural activity specifically related to the bimodal nature of the stimuli. Mean amplitudes were calculated for all electrodes in consecutive 20 ms windows between stimulus onset and 500 ms after stimulus presentation. The mean amplitude data were analyzed using repeated-measures analysis of variance with the within-subject factors of modality (AV, A + V), time window, and electrode. The Greenhouse-Geisser epsilon correction was applied to adjust the degrees of freedom of the F ratios as necessary. All statistical analyses were carried out using the SPSS version 16.0 software package (SPSS, Tokyo, Japan).

2.3 Results

Behavioral Results

The average reaction times are shown in Figure 2.4A. Analysis of the factor

stimulus type confirmed that reaction times to unimodal visual and bimodal audiovisual stimuli (auditory stimuli from four locations) differed significantly [F (4, 52) = 8.079; p = 0.001]. In addition, post-hoc comparisons found that reaction times to the bimodal audiovisual stimuli in which auditory stimuli were presented at the front (p = 0.01) and back (p = 0.031) of the participants were significantly faster than those to the unimodal visual stimuli. These results indicate that a synergistic effect occurred even when the auditory stimuli were presented in the back space. However, the pairwise comparisons showed no significant differences in reaction times between the unimodal visual stimuli and the bimodal audiovisual stimuli in which auditory stimuli were presented on the left (p = 0.333) or right (p = 0.066) of the participants.

Figure 2.4 Behavioral mean data over all participants in the experiment. AFV, ABV, ALV, ARV: auditory stimuli of bimodal audiovisual stimuli were presented at the front, back, left and right of the participant, respectively; V: unimodal visual stimuli.

The hit rates showed effects similar to those of the response times (Figure 2.4B). These effects were statistically expressed as a main effect of the within-subjects factor stimulus type [F (4, 52) = 7.317; p = 0.002]. The pairwise comparisons showed that

hit rates differed significantly between unimodal visual and bimodal audiovisual stimuli in which auditory stimuli were presented at the front (p = 0.02) or back (p = 0.027) of the participants (hit rates to the bimodal audiovisual stimuli were higher), but no significant differences were found between unimodal visual stimuli and bimodal audiovisual stimuli in which auditory stimuli were presented on the left (p = 0.068) or right (p = 0.336) of the participants.

However, regarding the false alarm rates, there was no significant effect of the factor stimulus type [F (4, 52) = 1.328, p = 0.28]. As Table 1 shows, the false alarm rates were low for all stimulus types; unimodal visual stimuli produced the lowest false alarm rates. The hit and false alarm rates were then used to compute the signal detection measure d′. There was no significant effect of stimulus type on d′ [F (4, 52) = 0.852, p = 0.466] (Table 1). These results indicate that if the hit rate for one stimulus type was high, the false alarm rate for that stimulus type was also high; thus, no significant difference was found between stimulus types for d′. Response criteria are also shown in Table 1: a significant effect of stimulus type on the response criterion was found [F (4, 52) = 6.143, p = 0.001]. Additionally, pairwise comparisons found that response criteria for the bimodal audiovisual stimuli in which auditory stimuli were presented at the front (p = 0.019) or back (p = 0.040) of the participants were lower than those for the unimodal visual stimuli.

ERP Results

Event-related potentials: unimodal stimuli

Figure 2.5 shows the group-averaged ERPs to unimodal visual stimuli and unimodal auditory stimuli at a subset of selected electrode sites at which the potentials were recorded at greater amplitudes. The visual ERP showed a prominent negative wave peaking at approximately 250 ms after stimulus onset (N250 component)

at occipital sites (maximal at O2) (Figure 2.5A).

Figure 2.5 Waveforms and scalp topographies of unimodal stimuli. (A) Unimodal visual occipital N250 component. (B) Unimodal auditory N1 (auditory stimuli were presented at the front, back, left or right of the subjects). Note that the auditory N1 effect shows a clear contralateral enhancement (below).

For the auditory stimuli, which were presented at the front (0°), back (180°), left (270°) and right (90°) of the subjects, ERPs showed a prominent negative wave, and the negative N1 wave peaked at approximately 140 ms post-stimulus at the frontal sites (peak: -2.61 μV, -2.96 μV, μV, -4.19 μV at Fz). Significantly, scalp topographies showed that the amplitude of the auditory N1 was greater over the hemisphere contralateral to the side of stimulus presentation (Figure 2.5B, below).

Event-related potentials: audiovisual integration

Figure 2.6 shows the AV and (A+V) ERPs at several electrodes when auditory stimuli were presented at the front, back, left and right of the subjects. Several remarkable integration patterns were identified over the right temporal and occipital areas at approximately 160 to 200 ms when the auditory stimuli were presented in front of the subjects, and over the parietal and occipital areas at approximately 360 to 400 ms when the auditory stimuli were presented at the back of the subjects. These different integration patterns are analyzed in detail below. However, few or no significant integration patterns were found when the auditory stimuli were presented at the left or right of the subjects.

Integration in right temporal and occipital areas (160 to 200 ms)

When auditory stimuli were presented in front of the subjects, the observed effects were the most notable because the differences in amplitudes between AV and (A+V) were highly significant at the right temporal area and extended to the right occipital electrodes (P4, P8, CP6, Oz, O2) (Figure 2.6).

Figure 2.6 Grand-average event-related potentials elicited by audiovisual stimuli and

(auditory plus visual). Event-related potentials of the sum of the unimodal stimuli (A+V) and the bimodal (AV) stimuli at a subset of electrodes are shown from 100 ms before the stimulus to 600 ms after. (A) Auditory stimuli from the front. (B) Auditory stimuli from the back. (C) Auditory stimuli from the left. (D) Auditory stimuli from the right. The square areas indicate the time periods in which the bimodal response differs significantly from the sum of the unimodal responses (p < 0.01). (AFV, ABV, ALV, ARV: bimodal audiovisual stimuli whose auditory stimuli were presented at the front, back, left and right of the subjects, respectively; AF, AB, AL, AR: unimodal auditory stimuli presented at the front, back, left and right of the subjects, respectively).

Analysis of these amplitudes showed significant main effects of modality [F (1, 13) = 13.23, p < 0.01] and electrode [F (4, 52) = 10.33, p < 0.01]. However, no significant interaction between modality and electrode was found [F (4, 52) = 0.713, p = 0.532]. The differences in amplitude between AV and (A+V) became apparent at approximately 140 ms and reached statistical significance between 160 and 200 ms at P4 [F (1, 13) = 13.01, p = 0.005] (mean amplitude, AV − (A+V): 0.95 μV), P8 [F (1, 13) = 15.32, p = 0.001] (mean amplitude: 1.16 μV) and CP6 [F (1, 13) = 19.09, p = 0.003] (mean amplitude: 1.10 μV). The occipital differences between AV and (A+V) were confirmed at Oz [F (1, 13) = 10.13, p = 0.010] (mean amplitude: 0.95 μV) and O2 [F (1, 13) = 6.52, p = 0.029] (mean amplitude: 0.91 μV). However, in the approximately 160 to 200 ms latency range, no significant differences between AV and (A+V) were found at the right temporal and occipital areas when auditory stimuli were presented at the back [F (1, 13) = 0.09, p = 0.762], left [F (1, 13) = 2.21, p =

0.169] or right [F (1, 13) = 3.46, p = 0.093] of the subjects (Figure 2.6). The topography of [AV − (A+V)], showing the effects of auditory stimuli from the four different locations, is presented in Figure 2.7. The positive values of the [AV − (A+V)] differences at the right temporal and occipital areas were due to smaller amplitudes in the AV than in the (A+V) condition. This result indicates that the amplitude of the auditory N1 response was smaller for bimodal stimuli than for unimodal auditory stimuli.

Integration of parietal and occipital areas (360 to 400 ms)

When auditory stimuli were presented behind the subjects, at approximately 360 to 400 ms latency, significant differences in amplitude between the AV and (A+V) conditions [F (1, 13) = 8.32, p < 0.05] were found at several parietal and occipital electrodes (Pz, P4, Oz, O2) (Figure 2.6). Statistical significance of p < 0.01 was reached between 360 and 400 ms at P4 (mean amplitude, AV − (A+V): 1.02 μV) and also between 380 and 400 ms at Oz (mean amplitude: 0.98 μV) and O2 (mean amplitude: 1.09 μV). However, neither a significant effect of electrode (Pz, P4, Oz, O2) nor a significant interaction between electrode and modality was found. In the same manner, no significant corresponding patterns were observed in this latency window when the auditory stimuli were presented at the other three locations (front [F (1, 13) = 3.41, p = 0.094], left [F (1, 13) = 0.54, p = 0.479] and right [F (1, 13) = 2.71, p = 0.131]). In addition, the topographies of the integration effect in this latency window for auditory stimuli at the back of the subjects appeared to be stronger than when the auditory stimuli originated from the other three locations (Figure 2.7).

Figure 2.7 Topography of the different significant spatio-temporal patterns of integration. Audiovisual integration occurred (A) over the right temporal area at approximately 160 to 200 ms when the auditory stimuli were located in front of the participants; (B) over the right occipital area at approximately 160 to 200 ms when auditory stimuli were located in front of the participants; (C) over the right occipital area at approximately 360 to 400 ms when auditory stimuli were located in back of the participants. (AF, AB, AL, AR: unimodal auditory stimuli were presented at the front, back, left and right of the subjects, respectively)

2.4 Discussion

The main focus of this study was to determine whether auditory stimuli in the horizontal plane affect audiovisual integration, particularly auditory stimuli presented at the back of subjects. Our behavioral and neural results demonstrated that integration occurred when the auditory stimuli were presented at the back and front of the participants. Two main event-related potential components related to audiovisual integration were found: first, auditory stimuli from the front location produced an ERP reaction over the right temporal and right occipital areas at approximately 160 to 200 ms; second, auditory stimuli from the back produced a reaction over the parietal and occipital areas at approximately 360 to 400 ms. No significant integration was found when auditory stimuli originated from the left or right spaces.

Integration of auditory stimuli from the front

A major integration effect was found in the 160 to 200 ms interval at the right temporal area, extending to the right occipital electrodes (Figure 2.6). Previous studies have reported audiovisual integration in which stimuli were presented in the frontal horizontal plane, and the observed reactions are closely analogous to this result [2,3,6,37]. However, the latency for centrally presented stimuli (160 to 200 ms) was somewhat earlier than previously found [2]. This effect may have been due to attention, which has been shown to augment audiovisual integration processes: audiovisual integration is greater under attended than unattended conditions [43,44,45], and audiovisual integration depends on the stimuli being fully attended [46]. In the current study, only the centrally presented visual stimuli were attended, unlike the study by Giard and Peronnet (1999), who investigated audiovisual integration under conditions of simultaneous attention to both modalities. In addition, a clear finding from our results is the strong predominance of the right

hemisphere in audiovisual integration. This result is in agreement with findings from previous studies, which showed that neural activation was greater in the right hemisphere [47,48]. However, in one study in which subjects received auditory, visual, and audiovisual letters of the Roman alphabet and were required to identify them regardless of stimulus modality, the left posterior superior temporal sulcus (STS) showed prominent audiovisual integration for letters [49]. Moreover, previous studies on speech stimuli have found that the left hemisphere is responsible for most integration functions [50,51,52]. These findings indicate that the left hemisphere is involved in the integration of certain types of information. Therefore, the characteristics of the stimuli strongly influence how audiovisual integration is distributed between the hemispheres.

Integration of auditory stimuli from the back

The most important, novel findings of the present study were the effects of auditory stimulation originating behind the participants (180°) on visual perception. In this portion of the study, behavioral facilitation was significant (p < 0.05) (Figure 2.4), and activity in the parietal and occipital areas was identified at approximately 360 to 400 ms after stimulus onset with audiovisual integration (Figure 2.7). These results do not agree with several previous studies [5,10,25], in which the performance enhancement of simple detection tasks was reduced or absent with spatial separation of the stimuli. In those studies, however, visual and auditory stimuli were both presented in front of the participants and were separated by only 30° or 40°. Additionally, other previous studies have investigated the perception of auditory stimuli in the horizontal and sagittal planes.
Findings from these studies showed that the perception of spatial auditory stimuli can be affected by the distances between the auditory stimulus and the two ears and demonstrated that the contribution of the near

ear increases and, conversely, that of the far ear decreases [53,54]. Thus, when the auditory stimuli were presented in the median vertical plane (front 0°; back 180°), the distances of the auditory signal from the front and back spaces to the two ears were equal, suggesting that the signal reached both ears simultaneously. Thus, the auditory event was temporally congruent with the visual signal that was presented in front of the participant. The importance of timing was also confirmed in several studies on multisensory interaction or integration, which found that the interaction or integration was greater when the visual and auditory stimuli occurred simultaneously [55,56]. Those results also demonstrated that spatial proximity of the two stimuli was critical for integration. Thus, the delayed response in the EEG signal to the auditory signal from the back space might be due to spatial discordance between the visual and auditory stimuli. Furthermore, previous functional magnetic resonance imaging (fMRI) studies have also investigated the perception and detection of auditory stimuli in the horizontal directions. The behavioral data from these studies showed no significant difference between the detection of auditory stimuli from the front and back spaces. However, the fMRI results showed that the activity of the left posterior superior temporal gyrus (pSTG) was greater for back than for front auditory stimuli [57]. These results indicate that integration was less efficient from the back space than from the front space, potentially because a greater number of neural resources are needed to perform the same task. Nevertheless, when visual and auditory stimuli were simultaneously presented in front of the subjects, a spatially consistent sound might rapidly facilitate attention and perception of the visual stimuli [5].
Therefore, in this study, audiovisual integration began later in the back space than in the front space according to the EEG signal, which was consistent with the behavioral results on back effects. In a similar manner, Zampini et al. (2007) studied auditory and somatosensory

stimulus pairings that were presented to either the same or different positions in the front and/or back of the participants. The results from that study showed that responses were significantly faster to bimodal auditory and somatosensory stimuli than to unimodal stimuli, regardless of whether the stimuli were presented from the front or back spaces [58]. The findings indicated that auditory stimuli from the back could also improve the perception of somatosensory stimuli to the same level as that achieved by auditory stimuli from the front space. These results, together with those of Jones et al. (2006) and the present study, confirm that the facilitation of behavioral responses occurred even though the modalities were presented in the front and back locations. Thus, these results suggest that new neural brain activities might be activated when modalities are presented in the back and that integration can occur without depending completely on the spatial position of the stimuli. However, further electrophysiological studies are needed to confirm and elucidate these neural mechanisms. Indeed, stimuli presented in the back space have often been neglected in multisensory research.

Integration of auditory stimuli from the left and right spaces

Our results showed that the contralateral auditory cortex was activated when unimodal auditory stimuli were presented in the left or right locations; this finding is consistent with previous studies [59]. However, no significant integration was found when visual stimuli were centrally presented and auditory stimuli were presented in the left (270°) or right (90°) location. Recently, behavioral studies have found that the facilitative effects of peripherally paired visual-auditory stimuli were smaller than those of centrally (0°) presented pairs and have even found that no facilitation was produced at the right 90° location [35].
In this study, it was confirmed by ERP analysis that no significant integration occurred when auditory stimuli were presented from the lateral

locations. A possible reason for this result is that the two ears are laterally symmetrical on the head, so that interaural differences arise: for auditory stimuli originating from the right (90°) side, different amounts of time are needed for the signal to reach the left and right ears [60]. Thus, under this condition, the auditory signal is both temporally and spatially incongruent with the visual signal. Moreover, previous studies have found that the relative timing of auditory and visual stimuli affects integration [61] and have confirmed that integration was greater when the auditory stimuli were presented in close temporal and spatial proximity to the visual stimuli [55,56]. Thus, it has been suggested that the temporal and spatial processing of the two signals is pivotal in determining whether integration occurs. These findings were consistent with our results. When a visual stimulus was presented at the front and an auditory stimulus was presented at the front or back, integration or interaction occurred; however, this integration was delayed when the auditory signal was presented in the back space. In contrast, when the auditory information was presented at the sides (and the visual signal was presented at the front), integration was not observed. Further electrophysiological studies are needed to elucidate these neural mechanisms.

2.5 Conclusions

This study shows the influences of auditory stimuli presented in the horizontal plane at the front (0°, the fixation point), right (90°), back (180°), and left (270°) of the participant on audiovisual integration processing. The data clearly showed that audiovisual integration occurred over the right temporal and right occipital areas at approximately 160 to 200 ms latency when the auditory stimuli were located in front of the participant. However, no significant integration was found when auditory stimuli were presented in the right and left spaces.
Furthermore, when the

auditory stimuli were presented from the back, audiovisual integration was observed over the parietal and occipital areas at approximately 360 to 400 ms. We believe that our findings, particularly regarding auditory stimuli originating from the back of participants, are likely to be a useful reference for further studies investigating the integration mechanism when stimuli of different modalities are presented at different spatial positions.
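The interaural-timing argument in the discussion can be made concrete with a small calculation. The sketch below uses Woodworth's spherical-head approximation of the interaural time difference (ITD); the head radius and speed of sound are assumed textbook values, not measurements from this study.

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average head radius (m)
SPEED_OF_SOUND = 343.0   # approximate speed of sound in air (m/s)

def itd_ms(azimuth_deg: float) -> float:
    """Interaural time difference (ms) for a source at the given azimuth
    from the median plane (valid for 0-90 degrees), using Woodworth's
    approximation ITD = (r / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return 1000.0 * (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source in the median plane (directly in front or directly behind,
# i.e. azimuth 0 from that plane) reaches both ears simultaneously,
# whereas a source at 90 degrees (lateral) arrives at the near ear
# roughly 0.66 ms earlier.
front_itd = itd_ms(0)
lateral_itd = itd_ms(90)
```

On this approximation, front (0°) and back (180°) sources both lie in the median plane and produce zero ITD, consistent with the argument that sounds from behind remain temporally congruent with a central visual stimulus, while lateral (90°) sources do not.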

Chapter 3 Age-related audiovisual integration elicited by peripherally presented audiovisual stimuli: a behavioral study

Summary

Although age-related multisensory integration (MSI) has previously been investigated, the effects of aging on MSI elicited by peripherally presented audiovisual (AV) stimuli remain unclear. In this study, visual, auditory and AV stimuli were randomly presented to the left or right of the central fixation point, and subjects (young and elderly adults) were asked to respond promptly to target stimuli. Using a race model to analyze the response times, our results showed that the AV behavioral facilitation of young subjects was significant (p < 0.05), with response times ranging from 240 ms to 450 ms and peaking at 360 ms (14%). In the elderly subjects, however, AV behavioral facilitation was delayed and extended over a wider range, from 260 ms to 540 ms, with a lower and later peak (12.6% at 390 ms). We found that the time window of AV behavioral facilitation in elderly subjects was longer but more delayed than in young subjects when the AV stimuli were presented peripherally. This finding further confirms that peripheral resolution decreases with age.

Keywords: Age-related, Audiovisual Integration, Peripherally, Race Model

3.1 Introduction

The study of age-related cognitive functions is becoming a major scientific and social issue as human life expectancy increases. Previous studies have found age-related declines in cognitive capacities such as attention [62], advance motor preparation [63], and working memory [64]. In the real world, the visual and auditory modalities are common channels for acquiring sensory information. Previous behavioral studies have shown that bimodal audiovisual (AV) stimuli can be discriminated or detected more rapidly and accurately than unimodal visual or auditory stimuli presented alone [2,4,6,29,30,37]. The effects of aging on multisensory integration (MSI) have attracted growing concern [65,66]. Some studies of aging effects have also shown response time facilitation for bimodal AV stimuli compared to unimodal stimulus conditions in old and young adults [14,15]. Furthermore, their results showed that integration in elderly subjects is greater than that in young subjects when visual stimuli are presented centrally. These results indicate that multiple sensory channels can supplement the unimodal sensory deficits associated with aging and suggest that the increase in AV integration in elderly adults might be due to changes in multisensory processing. Curran and colleagues studied the effects of aging on visuospatial attention using event-related potentials (ERPs) [67]. Their findings revealed that responses to peripheral targets were slower for elderly subjects than for young subjects. Additionally, ERP components were later for older adults than for young subjects. Such results indicate that there may be a decline in peripheral attention with age.
Furthermore, Talsma and Woldorff (2005) reported that attention can modulate AV integration processes and that integration was larger in attended conditions than in unattended conditions. Thus, we predicted that the integration

elicited by peripherally presented stimuli in elderly subjects would be delayed or weaker compared to that of young subjects. To confirm our predictions, we designed a visual and auditory discrimination task that included visual, auditory and AV stimuli, which were randomly presented with equal probability to the left or right of the central fixation point. By comparing the integration between young and elderly adults, we investigated whether the integration of elderly subjects would be delayed or weaker than that of the young subjects.

3.2 Methods

Subjects

Fifteen elderly subjects (60-78 years of age, mean age: 68.6 years) and fifteen young subjects (22-28 years of age, mean age: 23.9 years) participated in this study. All of the participants had normal or corrected-to-normal vision (none of the subjects were color-blind) and normal hearing capabilities. The mini-mental state examination (MMSE) was used to evaluate the subjects' cognitive function. Subjects were excluded if their MMSE score was more than 2.5 standard deviations from the mean for individuals matched for age and level of education [68]. In addition, subjects were also excluded if they self-reported any disease. The participants provided written informed consent for their participation in this study, which was approved by the ethics committee of Okayama University.

Experimental design

The experiment was performed in a dimly lit, sound-attenuated and electrically shielded room (laboratory room, Okayama University, Japan). Stimulus presentation and response collection were controlled using Presentation software (Neurobehavioral Systems, Inc.). A stimulus stream consisting of unimodal visual, unimodal auditory and bimodal AV stimuli (auditory and visual components occurring simultaneously)

was randomly presented to the left or right of the central fixation point. The visual target stimulus was a red and white block (5.2 cm × 5.2 cm, subtending a visual angle of approximately 5°) that was presented on a black background on a 21-inch computer monitor positioned 60 cm in front of the subject's eyes (Figure 3.1). These visual stimuli were presented peripherally, on either the left or right of the display, at an angle of approximately 12° from a centrally presented fixation point in the lower visual field (approximately 5° below the horizontal meridian) [43,69]. The auditory target stimulus was white noise. The auditory stimuli were presented to either the left or right ear through an earphone. The AV target stimulus consisted of the simultaneous presentation of both the visual and auditory target stimuli. The duration of each type of stimulus was 150 ms. The interstimulus interval (ISI) ranged between 1300 ms and 1800 ms. In addition, task-irrelevant stimuli, similar to the visual and auditory target stimuli, were presented at a frequency of 20% of the total stimuli; the task-irrelevant visual stimulus was a black and white block, and the task-irrelevant auditory stimulus was a 1000 Hz sinusoidal tone with a linear rise and fall time of 5 ms and an amplitude of 60 dB.

Figure 3.1 Examples of the stimulus sequences in the experiment.
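The construction of such a randomized stimulus stream can be sketched as follows. This is a hypothetical illustration, not the actual Presentation script; the function and field names are our own, the per-modality count is an illustrative parameter, and the ISI range and stimulus duration follow the text.

```python
import random

MODALITIES = ("visual", "auditory", "audiovisual")
SIDES = ("left", "right")

def make_block(n_per_modality: int, seed: int = 0):
    """Build one randomized block: each stimulus has a modality, is
    assigned to the left or right hemispace with equal probability,
    and is followed by a random ISI between 1300 and 1800 ms."""
    rng = random.Random(seed)
    trials = []
    for modality in MODALITIES:
        for _ in range(n_per_modality):
            trials.append({
                "modality": modality,
                "side": rng.choice(SIDES),            # equal left/right probability
                "duration_ms": 150,                   # stimulus duration from the text
                "isi_ms": rng.uniform(1300, 1800),    # inter-stimulus interval
            })
    rng.shuffle(trials)   # random presentation order within the block
    return trials

block = make_block(100)   # 100 per modality, for illustration
```

Drawing the ISI uniformly from a range, rather than fixing it, prevents subjects from anticipating the exact moment of stimulus onset.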

Five experimental blocks were performed in this study, with each block lasting approximately 5 min. Each block consisted of 300 visual stimuli, 300 auditory stimuli and 300 AV stimuli. All of the stimuli were randomly presented, and each stimulus had an equal probability of appearing to the left or right of the central fixation point. Each block had a 3000 ms fixation period followed by the test stimulus. During the response period that followed the test stimulus, the screen was cleared. The experiment continued with the next trial at the set ISI (1300-1800 ms), regardless of whether the subject responded to the target stimulus. The participants were instructed to indicate whether a target appeared in the left or right hemispace by pressing the left or right key with the index or middle finger of their right hand as quickly and accurately as possible.

Data analysis

Hit rates and response times for targets were computed separately for each stimulus type and each age group. Hit rates were the number of correct responses to target stimuli divided by the total number of target stimuli; responses to task-irrelevant stimuli were not counted. Response time data were analyzed for correct responses. Hit rates and response times were subjected to an analysis of variance (ANOVA) with the within-subjects factors of location (two levels: left or right hemispace) and stimulus modality (three levels: visual, auditory or AV). In all of the analyses, group (two levels: young or elderly subjects) served as the between-subjects factor. Although the ANOVA comparison of response times identified responses to the bimodal AV stimuli that were faster than the responses to either the unimodal visual or auditory stimuli, this analysis employed only a single central-tendency score (mean or median) to determine the response speed. To control for the redundant nature of the

AV stimulus conditions, an independent race model was adopted [23,24]. This model analyzes the response times using cumulative distribution functions (CDFs): the AV data were compared with the statistical facilitation predicted by a CDF of the summed probability of the visual and auditory responses. To complete this analysis, the CDFs for each trial type were generated for each subject using 10 ms time bins. Each subject's unimodal CDFs were used to calculate the race distribution at each time bin using the following formula: [P(A) + P(V)] − [P(A) × P(V)]. If the probability of the response to the bimodal AV stimulus was significantly greater than that predicted by the summed probability of the unimodal stimuli, AV integration was said to have occurred [23,24]. Each subject's race model curve was then subtracted from their AV CDF, for both young and elderly adults. A one-sample t-test was performed at each time bin within each group to determine whether this difference curve was significantly greater than zero. A two-sample t-test was performed to compare the difference curves of the young and old adults to evaluate significant group differences.

3.3 Results

Hit rates

The hit rates are shown in Table 3.1. The overall hit rates were greater than 93% for the visual, auditory and AV conditions in both the young and old adults. No significant main effects or interactions were found using an ANOVA with the factors of stimulus modality (visual, auditory and AV), location (left and right hemispace) and group (young and old adults).

Response times

The mean response times are shown in Table 3.1. A 2 (group) × 3 (modality) × 2 (hemispace) ANOVA of the response times to target stimuli showed main

effects of group [F (1, 28) = 9.73, p = 0.004], modality [F (2, 56) = 46.53, p < 0.001] and location [F (1, 28) = 6.68, p = 0.015]. These results indicate that the response times differed significantly between the two age groups (the elderly subjects responded more slowly) and among the modalities (the responses to the AV stimuli were faster). A 3 (modality) × 2 (hemispace) ANOVA of the response times showed main effects of modality in both the young [F (2, 28) = 47.54, p < 0.001] and old adults [F (2, 28) = 18.25, p < 0.001], but no main effects of hemispace were found in the young [F (1, 14) = 1.15, p > 0.05] or old adults [F (1, 14) = 5.88, p > 0.05].

Table 3.1 The mean response times and hit rates for both young and old adults. Standard deviations are given in parentheses.

                       Young adults                  Old adults
                       Left          Right           Left          Right
Response times (ms)
  Auditory             462 (128.4)   458 (123.0)     559 (126.7)   534 (116.7)
  Visual               457 (80.0)    440 (101.0)     527 (113.6)   515 (100.0)
  Audiovisual          383 (78.9)    389 (108.1)     452 (105.9)   441 (94.3)
Hit rate
  Auditory             0.94 (0.03)   0.95 (0.04)     0.94 (0.04)   0.94 (0.03)
  Visual               0.95 (0.04)   0.94 (0.02)     0.93 (0.05)   0.93 (0.04)
  Audiovisual          0.96 (0.02)   0.95 (0.02)     0.93 (0.04)   0.94 (0.04)

No differences in response times were found between the left and right hemispaces, so the response time data from the left and right hemispaces were combined. The average response times for young and old adults are illustrated in Figure 3.2. A 2 (group) × 3 (modality) ANOVA of the response times showed main effects of

group [F(1, 28) = 9.73, p = 0.004] and modality [F(2, 56) = 46.53, p < 0.001]. In addition, a main effect of the factor stimulus modality confirmed that the response times differed significantly from each other in both the young [F = 5.47, p = 0.008] and the old adults [F = 7.21, p = 0.002]. For the young subjects, post-hoc comparisons found that response times to the bimodal AV stimuli were significantly faster than those to the unimodal visual (p = 0.038) or auditory (p = 0.011) stimuli. For the elderly subjects, post-hoc comparisons likewise showed that response times were faster for the bimodal AV stimuli than for the unimodal visual (p = 0.029) or auditory (p = 0.002) stimuli.

Figure 3.2 Average reaction time by modality and group (error bars show the SEM). (Y-A, Y-V, Y-AV: auditory, visual and audiovisual stimuli in the young subjects, respectively; O-A, O-V, O-AV: auditory, visual and audiovisual stimuli in the elderly subjects, respectively; *: p < 0.05)

Race model comparisons

The CDFs of the response times to the visual, auditory and AV stimuli are shown in Figure 3.3A (young subjects) and Figure 3.3B (elderly subjects).

Figure 3.3 Distributions of response times for target discrimination in young and elderly subjects. (A) Cumulative distribution functions (CDFs) for discrimination response times to auditory (A), visual (V) and audiovisual (AV) stimuli in young subjects. The summed probability of the visual and auditory responses is depicted by the race model curve (Race model). Note that the audiovisual responses are typically faster than the race model prediction. (B) CDFs in elderly subjects. (C) The cumulative probability difference curve illustrates the behavioral enhancements under audiovisual conditions compared to the race model prediction for the young (solid line) and the elderly (dotted line) subjects.
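The race-model prediction plotted in Figure 3.3 can be constructed mechanically from the unimodal response-time distributions. The following is a minimal Python sketch using simulated response times; the variable names and the simulated values are ours for illustration, not the thesis data or analysis scripts:

```python
import numpy as np

def cdf(rts, bins):
    """Cumulative probability of a response by the end of each time bin."""
    rts = np.asarray(rts)
    return np.array([np.mean(rts <= t) for t in bins])

bins = np.arange(100, 1001, 10)          # 10 ms time bins, as in the analysis

# simulated single-subject response times (ms), illustrative only
rng = np.random.default_rng(1)
rt_a = rng.normal(460, 120, 200)         # auditory
rt_v = rng.normal(450, 90, 200)          # visual
rt_av = rng.normal(385, 85, 200)         # audiovisual

p_a, p_v, p_av = cdf(rt_a, bins), cdf(rt_v, bins), cdf(rt_av, bins)

# race-model prediction at each bin: P(A) + P(V) - P(A) * P(V)
race = p_a + p_v - p_a * p_v

# positive values mean AV responses are faster than the race model predicts
diff = p_av - race
```

At the group level, each subject's `diff` curve would then be tested against zero with a one-sample t-test at every time bin, and the two groups' curves compared with a two-sample t-test.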

By comparing the CDFs, it is clear that the responses to the bimodal AV stimuli were faster than the responses to the unimodal visual or auditory stimuli in both age groups. Moreover, a portion of the responses to the bimodal AV stimuli exceeded that predicted by the statistical summation of the unimodal visual and auditory responses (Figure 3.3A, B), suggesting that AV integration occurred. To quantify this relationship, we subtracted the race model prediction from the AV CDF for each age group (Figure 3.3C). The difference between the response times to the AV stimuli and the prediction made by the race model was significant (p < 0.05, one-sample t-test) in both age groups, illustrating the behavioral enhancements observed under the AV conditions relative to the race model prediction for the young and old adults. For the young subjects, the AV behavioral facilitation was significant (p < 0.05) over response times ranging from 240 ms to 450 ms, peaking at 360 ms (14%), whereas the facilitation in the elderly subjects was delayed but more extended, ranging from 260 ms to 540 ms and peaking at 390 ms (12.6%) (Figure 3.3C).

3.4 Discussion

This study investigated the effects of age on multisensory integration (MSI) elicited by peripherally presented AV stimuli. We found that the time window of AV behavioral facilitation in the elderly subjects was longer but more delayed than that in the young subjects.

In this study, the time window of AV behavioral facilitation in the elderly subjects (from 260 ms to 540 ms) was longer than that in the young subjects (from 240 ms to 450 ms) (Figure 3.3C). These results are consistent with previous findings of an overall slowing of response times in elderly subjects compared to young subjects [14].
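The facilitation windows and peaks quoted above can be read off the per-bin difference curve mechanically. The sketch below illustrates one way to do this; the smooth toy curve and the significance mask are invented for illustration, and only the 10 ms binning follows the text:

```python
import numpy as np

def facilitation_window(bins, diff, sig_mask):
    """Return (onset, offset, peak_time, peak_value) of significant facilitation."""
    sig = np.where(sig_mask & (diff > 0))[0]
    if sig.size == 0:
        return None
    onset, offset = bins[sig[0]], bins[sig[-1]]
    peak = sig[np.argmax(diff[sig])]
    return onset, offset, bins[peak], diff[peak]

# toy difference curve peaking at 360 ms, sampled in 10 ms bins
bins = np.arange(200, 501, 10)
diff = 0.14 * np.exp(-((bins - 360) / 80.0) ** 2)
sig_mask = diff > 0.02          # stand-in for per-bin t-test significance

window = facilitation_window(bins, diff, sig_mask)
```

With real data, `sig_mask` would come from the one-sample t-tests at each bin rather than from a fixed threshold.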

In this study, the AV behavioral facilitation in the elderly subjects (peaking at 12.6% at 390 ms) was more delayed and weaker than that in the young subjects (peaking at 14% at 360 ms) (Figure 3.3C). These results are inconsistent with those of previous studies [14,15,70]. Laurienti et al. (2006) and Peiffer et al. (2007) found that the AV behavioral facilitation was greater in elderly subjects than in young subjects, whereas Mahoney et al. (2011) showed that the AV behavioral facilitation was significantly greater in young subjects than in elderly subjects. Some possible reasons for these inconsistent results are as follows. Firstly, the location of the presented experimental stimuli is an important factor in MSI studies: Laurienti et al. (2006) and Peiffer et al. (2007) presented visual stimuli centrally, while Mahoney et al. (2011) presented bilateral visual stimuli peripherally, and the experimental stimuli in our study were presented laterally and peripherally. In addition, it has been noted that peripheral resolution declines with age [71], indicating that the peripheral discrimination of elderly subjects may be weaker than that of young subjects. Thus, we speculate that this might explain our observation that the MSI in the elderly subjects was delayed and weaker than that in the young subjects. Secondly, several studies have shown a decline in attentional and cognitive function with aging [72,73,74], and Talsma and Woldorff (2005) reported that attention can modulate MSI processes in a divided-attention task. It is therefore possible that the timing of AV integration is delayed in elderly subjects. These results indicate that MSI is consistent with the generalized slowing that is thought to accompany aging [75].
However, further studies are needed to confirm and elucidate those details.

3.5 Conclusion

In conclusion, we have shown that the integration elicited by peripherally presented audiovisual stimuli is age-related: the time window of audiovisual behavioral facilitation in elderly subjects was longer and more delayed compared to that of young subjects when the audiovisual stimuli were presented peripherally. This finding is also consistent with the decline of peripheral resolution with age.

Chapter 4 Audiovisual integration elicited by peripherally presented stimuli in young and elderly adults: an event-related potential study

Summary

To investigate the neural mechanisms of audiovisual integration in young and elderly adults, we recorded event-related potentials (ERPs) during a divided-attention task in which stimuli were presented in the auditory (A), visual (V) and audiovisual (AV) modalities. The ERPs elicited by the auditory and visual stimuli presented alone were summed ("sum" ERP) and compared to the ERP elicited when they were presented simultaneously ("simultaneous" ERP). Two auditory stimuli (20% white-noise targets and 80% frequent 1000 Hz tones) and two visual stimuli (20% red-and-white block targets and 80% frequent black-and-white blocks) were delivered. The behavioral data and the AV and (A + V) ERPs were analyzed separately for each stimulus location. Reaction times (RTs) to the stimuli presented simultaneously were significantly faster than to those presented alone. For the young adults, audiovisual integration comprised three phases of effects, distinguished by their scalp distributions: first, over the right fronto-central area at 200–220 ms after the presentation of the stimulus; second, over the centro-medial area at 260–320 ms; and third, over the right posterior area at 340–440 ms. For the elderly adults, two periods of significant interactions were found: over the right occipital area at 160–200 ms and over the left fronto-central area at 300–600 ms after the presentation of the stimulus. These interaction effects in the elderly adults occurred later than those in the young adults.

Keywords: Audiovisual integration, Peripheral presentation, Aging

4.1 Introduction

Our brains are continuously inundated with stimulation arriving through multiple sensory pathways. For example, we need to synthesize auditory (hearing the car engine), visual (seeing the road) and somatosensory (feeling the steering wheel) information when driving a car. Many previous studies have investigated audiovisual integration using behavioral and event-related potential (ERP) measures. In many of these studies, however, one of the audiovisual modalities was predominantly attended: for example, audiovisual integration has been elicited under a visual-attention condition [6], and audiovisual stimulation has been shown to enhance auditory detection under an auditory-attention condition [30]. In addition, many previous studies of multisensory audiovisual (AV) integration have used divided-attention tasks, in which both the auditory and the visual information were task relevant and subjects were required to attend to both auditory and visual stimuli [2,3,4,29]. Behavioral results have indicated that responses to AV stimuli are more rapid and accurate in a divided-attention task than responses to either unimodal visual or unimodal auditory stimuli. Talsma and Woldorff reported that attention can modulate audiovisual integration processes in a divided-attention task [43]. Talsma et al. (2007) studied the interactions between attention and AV integration in a context in which auditory and visual stimuli were presented centrally, and reported that multisensory integration effects depended on the attention condition: early AV integration occurred only in a divided-attention task. In a visual-attention task in which only the visual information was task relevant and required attention, the visual and auditory stimuli interacted with each other after 200 milliseconds of presentation of the stimulus (POS) [46]. It thus remains unclear how audiovisual integration proceeds when the auditory and visual modalities are attended simultaneously in a divided-attention task.
There is a paucity of research examining audiovisual integration across auditory and visual pairings in old adults. Laurienti et al. (2006) and Peiffer et al. (2007) studied audiovisual integration in older adults and showed that old adults exhibit greater audiovisual integration than young adults. Such results indicate that multiple sensory channels could compensate for unimodal sensory deficits with aging [14], and it has been suggested that the increase in audiovisual integration in old adults may be due to changes in multisensory processing [15]. Recently, Mahoney et al. (2011) investigated integration across three multisensory pairings (auditory-somatosensory, auditory-visual and visual-somatosensory) and found that audiovisual

integration is significantly greater in young adults than in old adults [70]. This result is in contrast to the findings of Peiffer et al. (2007). Together, these studies indicate that older adults exhibit different audiovisual integration processing under different paradigm conditions. Moreover, although behavioral studies have investigated this audiovisual integration, no neuroimaging studies have been conducted to identify the brain activities related to this phenomenon. In the present study, we therefore designed a discrimination task in which unimodal visual stimuli, unimodal auditory stimuli and bimodal audiovisual stimuli were randomly presented on the left or right side of the screen, in order to ascertain the mechanism of audiovisual integration in young and elderly adults.

4.2 Methods

Subjects

Nine healthy young male volunteers (aged 22–28 years, mean age 24.2 years) and six healthy older volunteers (aged 60–78 years, mean age 67.8 years) participated in the experiment. None of the subjects were color-blind, and subjects who self-reported any disease were excluded. The participants provided written informed consent for their participation in the study, which was approved in advance by the ethics committee of Okayama University.

Stimuli

The experiment contained three stimulus types: unimodal visual (V) stimuli, unimodal auditory (A) stimuli and bimodal audiovisual (AV) stimuli (auditory and visual components occurring simultaneously). Each type had two subtypes, standard stimuli and target stimuli. The duration of each stimulus was 150 ms, and the interstimulus interval (ISI) varied randomly between 1300 and 1800 ms. The unimodal visual standard stimuli consisted of a black-and-white block (5.2 cm × 5.2 cm, subtending a visual angle of approximately 5°) presented against a black background.
These visual stimuli were presented at lateral locations on either the left or the right of the display, at an angle of about 12° from a centrally presented fixation point in the lower visual field (about 5° below the horizontal meridian), each with a duration of 150 ms. The unimodal auditory standard stimuli consisted of a 1000 Hz tone pip with a total duration of 150 ms, linear rise and fall times of 5 ms, and an amplitude of 60 dB. These stimuli were presented to either the left or the right ear through earphones. The

bimodal audiovisual standard stimuli consisted of a combination of both the auditory and the visual features. The unimodal visual and auditory target stimuli, which made up 20% of all stimuli, were similar to the corresponding standard stimuli, except that the visual target was a red-and-white block and the auditory target was white noise. In brief, the present study contained 12 different stimulus categories (trial types), formed by the combination of stimulus modality (three levels: unimodal visual, unimodal auditory, or bimodal audiovisual), stimulus identity (two levels: standard or target) and presentation side (two levels: left or right).

Figure 4.1 Examples of stimulus sequences in a single trial of the experiment. The task starts with a blank black screen with a duration of 3000 ms (a). A unimodal visual standard stimulus (black-and-white block) is first presented on the left for 150 ms (b). Following stimulus presentation, the screen is cleared and the subject is allowed to respond (c). The interstimulus interval (ISI) is 1300–1800 ms. In this example, a visual target stimulus (red-and-white block) is then presented on the right (d). The screen is cleared and the subject responds (e). The next stimulus shown is a bimodal audiovisual stimulus (visual and auditory standard stimuli) on the right (f). The order of stimulus presentation (unimodal visual, unimodal auditory and bimodal audiovisual) is randomized to limit habituation and predictability.
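A randomized sequence of the trial types described above (3 modalities × 2 identities × 2 sides, 20% targets, 150 ms stimuli, 1300–1800 ms ISI) can be generated along the following lines. This is our own sketch of such a generator, not the presentation software actually used:

```python
import random

MODALITIES = ("visual", "auditory", "audiovisual")
SIDES = ("left", "right")

def make_block(n_per_modality=300, target_rate=0.2, seed=0):
    """Build one randomized block of trials with 20% targets per modality."""
    rng = random.Random(seed)
    trials = []
    for modality in MODALITIES:
        n_targets = int(n_per_modality * target_rate)
        identities = (["target"] * n_targets
                      + ["standard"] * (n_per_modality - n_targets))
        for i, identity in enumerate(identities):
            side = SIDES[i % 2]              # equal probability left/right
            isi = rng.uniform(1300, 1800)    # ms, as in the text
            trials.append({"modality": modality, "identity": identity,
                           "side": side, "isi_ms": isi, "duration_ms": 150})
    rng.shuffle(trials)                      # randomize presentation order
    return trials

block = make_block()
```

Shuffling the whole block at once is what limits habituation and predictability, since the modality of the next trial cannot be anticipated.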

Task description

Each block of trials lasted about 5 min, and each block consisted of 300 unimodal visual stimuli, 300 unimodal auditory stimuli and 300 bimodal audiovisual stimuli. All stimuli were randomly presented, and each type of stimulus was presented with equal probability in the left and the right hemispace. At the beginning of the experiment, the participants were seated and given a description of the task, along with a number of practice blocks, and training continued until the experimenter was convinced that the participants understood and could perform the task. As shown in Figure 4.1, participants were instructed to attend to all the stimuli and were required to press the left button of a computer mouse when a target stimulus was presented on the left and the right button when a target stimulus was presented on the right, using their right hand. To avoid movement artifacts, participants were further instructed to minimize blinking and bodily movements and to fixate on a centrally presented fixation dot. In addition, they were instructed to respond as quickly and accurately as possible.

Apparatus

The visual stimuli were presented on a 21-inch computer display located at a distance of 60 cm, directly in front of the participants' eyes. Auditory stimuli were presented over earphones at the two sides of the computer screen. Stimulus presentation was controlled by a personal computer. An electroencephalographic (EEG) system (BrainAmp MR plus, Gilching, Germany) was used to record EEG signals through 32 electrodes mounted on an electrode cap (Easycap, Herrsching-Breitbrunn, Germany). All signals were referenced to the left and right mastoids. Horizontal eye movements were measured by deriving the electro-oculogram (EOG) from an electrode placed at the outer canthi of the subjects' eyes.
Vertical eye movements and eye blinks were detected by deriving an EOG from an electrode placed approximately one centimeter below the subject's left eye. Impedances were kept below 5 kΩ. Raw signals were digitized at a sampling frequency of 500 Hz with a 60 Hz notch filter, and all data were stored digitally for off-line analysis. The event-related potential (ERP) analysis was carried out using Brain Vision Analyzer software (version 1.05, Brain Products GmbH, Munich, Bavaria, Germany).
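Offline, continuous recordings of this kind are cut into stimulus-locked epochs, baseline-corrected, and screened for artifacts before averaging. The following generic numpy sketch illustrates that step; the 500 Hz rate, the −200 to +800 ms window and the ±80 µV criterion follow the recording and data-analysis descriptions in this chapter, while the function names and the simulated data are our own:

```python
import numpy as np

FS = 500                      # sampling rate (Hz), as recorded
PRE, POST = 0.2, 0.8          # epoch window: -200 ms to +800 ms
REJECT_UV = 80.0              # artifact rejection criterion (uV)

def epoch(eeg, onsets, fs=FS):
    """Cut (n_channels, n_samples) EEG into epochs around stimulus onsets."""
    n_pre, n_post = int(PRE * fs), int(POST * fs)
    segs = [eeg[:, s - n_pre:s + n_post] for s in onsets
            if s - n_pre >= 0 and s + n_post <= eeg.shape[1]]
    return np.stack(segs)     # (n_epochs, n_channels, n_samples)

def preprocess(epochs, fs=FS):
    """Baseline-correct to the pre-stimulus interval and reject noisy epochs."""
    baseline = epochs[:, :, : int(PRE * fs)].mean(axis=2, keepdims=True)
    epochs = epochs - baseline
    keep = np.abs(epochs).max(axis=(1, 2)) <= REJECT_UV
    return epochs[keep]

# simulated data: 32 channels, 60 s of noise, 50 stimulus onsets
rng = np.random.default_rng(2)
eeg = rng.normal(0, 10, (32, 60 * FS))
onsets = np.arange(1, 51) * 550                     # sample indices
erp = preprocess(epoch(eeg, onsets)).mean(axis=0)   # average per channel
```

In practice, a dedicated package (here, Brain Vision Analyzer) performs these steps, along with filtering and ocular correction; the sketch only shows the bookkeeping involved.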

Data analysis

Behavioral results analysis

Reaction times (RTs) for correct detections of targets and hit rates (HRs) were computed separately for all stimulus types and hemispaces. These measures were analyzed using repeated-measures analyses of variance (ANOVAs) with stimulus modality (unimodal visual, unimodal auditory and bimodal audiovisual) and stimulus location (left hemispace, right hemispace) as within-subject factors.

ERP results analysis

The ERPs elicited by the standard stimuli were analyzed. Continuous EEG signals were band-pass filtered, ocular-corrected, segmented and baseline-corrected (−100 to 0 ms), and an artifact criterion of ±80 µV was used to reject trials with noise transients at any channel. The EEG signals were divided into epochs from 200 ms before stimulus onset to 800 ms after onset. Grand-averaged data were obtained across all participants for each stimulus type. ERP data from the left and right hemispaces were combined to improve the signal-to-noise ratio of the ERPs. The mean amplitude at each electrode was calculated in consecutive 20 ms windows between stimulus onset and 800 ms after presentation of the stimulus (POS).

4.3 Results

Behavioral results

Figure 4.2A presents the mean reaction times to the different stimuli in the young adults. A main effect of the factor stimulus type confirmed that the response times to visual-only, auditory-only and audiovisual stimuli differed significantly from each other [F(2, 18) = 3.1; p < .001]. Consistent with previous behavioral studies, subsequent planned comparisons showed that subjects responded faster to audiovisual stimulus targets than to visual ones [F(1, 9) = 77.1; p < .001]. In addition, responses to audiovisual stimuli were also significantly faster than responses to auditory stimuli [F(1, 9) = 83.8; p < .001].
Finally, no significant differences were found between responses to visual stimuli and responses to auditory stimuli [F(1, 9) = 4.1; p = .073]. Hit rates are shown in Figure 4.2B. No significant differences in accuracy were found among the visual, auditory and audiovisual stimuli. Likewise, neither significant main effects of location (left vs. right) nor significant interactions

between location and stimulus type were found.

Figure 4.2 Response time (A) and hit rate (B) in the young adults, by modality (A, V, AV) and hemispace (Left, Right).

For the elderly adults, the mean reaction times and hit rates for each target type are shown in Figure 4.3A. A main effect of the factor stimulus type confirmed that the response times to unimodal visual, unimodal auditory and bimodal audiovisual stimuli differed significantly from each other [F(2, 10), p < 0.001]. Consistent with previous behavioral studies, subsequent planned comparisons showed that subjects responded faster to bimodal audiovisual stimulus targets than to unimodal visual ones (p = 0.04). In addition, responses to bimodal audiovisual stimuli were also significantly faster than responses to unimodal auditory stimuli (p = 0.002). A main effect of hemispace (left vs. right) was found [F(1, 5), p = 0.002], but no interaction was found between stimulus type and hemispace [F(2, 10) = 0.494, p = 0.557]. Hit rates were greater than 83% for the unimodal visual, unimodal auditory and bimodal audiovisual conditions. A main effect of stimulus type was found using an analysis of variance (ANOVA) [F(2, 10) = 5.126, p = 0.038]. Post-hoc tests revealed a significant difference between the unimodal visual and unimodal auditory stimuli (p = 0.005) and between the unimodal visual and bimodal audiovisual stimuli (p = 0.04).

Event-related potential (ERP) results

Only the ERPs elicited by the standard stimuli were analyzed. Continuous EEG signals were divided into epochs from 100 ms before stimulus onset to 500 ms after stimulus

onset. Multisensory integration processes were studied using an approach similar to that taken by Giard and Peronnet (1999), derived originally from approaches used in the animal single-unit literature [4]. That is, the ERPs to the unisensory auditory (A) and visual (V) stimuli were summed and compared with the ERPs elicited by the audiovisual (AV) stimuli: audiovisual integration was assessed from the differences between the responses to the bimodal (AV) stimuli and the algebraic sum of the unisensory auditory and visual responses (A + V). Although the AV ERPs strongly resembled the (A + V) waveforms, differences could be observed between the two curves. For the young adults, three periods of significant interactions were distinguished: (1) from 200 to 220 ms after stimulus onset, (2) from 260 to 320 ms, and (3) later significant effects from 340 to 440 ms. Figures 4.4 and 4.5 show the mean amplitude potential distributions of the AV, (A + V) and [AV − (A + V)] responses.

1) From 200 to 220 ms

In this latency window, the [AV − (A + V)] amplitudes were significant at several fronto-central electrodes (F3, F4, FC1, FC2, FC5, FC6, C3, C4) (Figure 4.5). The AV response was stronger than the (A + V) response, so [AV − (A + V)] showed positive potentials at the fronto-central sites from 200 to 220 ms (Figure 4.6A). The positive values in the [AV − (A + V)] differences at fronto-central sites were therefore due to larger amplitudes (in absolute values) of the positive potentials in the AV than in the (A + V) condition.

Figure 4.3 Response time (A) and hit rate (B) in the elderly adults, by modality (A, V, AV) and hemispace (Left, Right). *: p < 0.05; **: p < 0.01.

Figure 4.4 Overlap of the grand-averaged ERPs to AV stimuli and the sum of the ERPs to A and V stimuli at 30 electrodes in the young adults.

Figure 4.5 The difference waveform [AV − (A + V)] at 30 electrodes in the young adults. Significant differences between AV and (A + V) were found at a. fronto-central areas around 200–220 ms; b. central areas around 260–320 ms; and c. centro-parietal areas around 340–440 ms.

Figure 4.6 Topography of the different significant spatio-temporal patterns of interaction in the young adults. Each row displays the mean potential distributions of the bimodal (AV) responses, the sum of the unimodal responses (A + V), and the interaction pattern quantified by the difference between the bimodal responses and the sum of the unimodal responses [AV − (A + V)]. A: 200–220 ms; B: 260–320 ms; C: 340–440 ms.

2) From 260 to 320 ms

In this latency range, the greatest differences between AV and (A + V) were found at fourteen electrodes (F3, Fz, F4, FC1, FC2, FC5, FC6, Cz, C3, C4, CP1, CP2, CP5, CP6) covering a wide central scalp region between 260 and 320 ms (p < .001) (Figure 4.4). Several components could be distinguished by a fine-grained analysis of the potentials (Figure 4.5). The topographies of the AV and (A + V) responses each displayed clear positive potentials over the central regions (Figure 4.6B). The negative values at central sites in the [AV − (A + V)] differences resulted from smaller amplitudes of the positive potentials in the AV than in the (A + V) condition.

3) From 340 to 440 ms

The effects observed in this latency window were the most prominent, since the [AV − (A + V)] amplitudes were significant over 18 electrodes (Fz, F4, F8, FC1, FC2, FC5, FC6, T7, T8, C3, C4, Cz, CP1, CP2, CP6, Pz, P4, P8) between 340 and 440 ms (Figure 4.4). The topographies of the AV and (A + V) responses each displayed clear positive potentials, but the [AV − (A + V)] potential showed negative values (Figure 4.6C). Moreover, the negative [AV − (A + V)] amplitudes observed at central sites extended towards the right electrodes (F4, F8, CP2, CP6) and reached statistical significance at these sites (p < .001). It is worth noting that the negative [AV − (A + V)] amplitudes at central sites also extended towards the right posterior electrodes (Pz, P4, P8) (Figure 4.6C) and reached statistical significance at these posterior sites (p < .001). This suggests that the right posterior effect is due to new neural activities specifically elicited by the combined auditory and visual stimuli.

For the elderly adults, the (A + V) and AV ERPs are shown in Figure 4.7. Although the AV ERPs strongly resembled the (A + V) waveforms, differences could be observed between the two curves.
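The summation comparison used throughout this chapter (testing the bimodal response against the algebraic sum of the unimodal responses in consecutive 20 ms windows) can be sketched as follows. The difference waves are simulated; only the window length and the −100 to 500 ms epoch follow the text:

```python
import numpy as np
from scipy import stats

FS = 500                                   # sampling rate (Hz)
times = np.arange(-100, 500, 1000 // FS)   # ms, one tick per sample

def av_minus_sum(erp_a, erp_v, erp_av):
    """Difference between the bimodal ERP and the summed unimodal ERPs."""
    return erp_av - (erp_a + erp_v)

def windowed_ttest(diffs, times, win_ms=20):
    """One-sample t-test of [AV - (A+V)] against zero in consecutive windows.

    diffs: (n_subjects, n_samples) difference waves for one electrode.
    Returns (window_start_times, p_values).
    """
    starts = np.arange(times[0], times[-1], win_ms)
    pvals = []
    for w0 in starts:
        mask = (times >= w0) & (times < w0 + win_ms)
        win_means = diffs[:, mask].mean(axis=1)   # one value per subject
        pvals.append(stats.ttest_1samp(win_means, 0.0).pvalue)
    return starts, np.array(pvals)

# simulated data: 9 subjects, one electrode, noise-only difference waves
rng = np.random.default_rng(3)
diffs = rng.normal(0, 1, (9, times.size))
starts, pvals = windowed_ttest(diffs, times)
```

Windows where `pvals` falls below the chosen threshold would mark latency ranges of significant audiovisual interaction at that electrode.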
These differences in [AV − (A + V)] amplitude at the 30 electrode sites, from −200 to 800 ms, can be observed in Figure 4.8.

Figure 4.7 Overlap of the grand-averaged ERPs to AV stimuli and the sum of the ERPs to A and V stimuli at 30 electrodes in the elderly adults.

Figure 4.8 The difference waveform [AV − (A + V)] at 30 electrodes in the elderly adults.

Figure 4.9 Topographies of the AV − (A + V) differences related to multisensory integration in the elderly adults: the right occipital area 160–200 ms after the presentation of the stimulus and the left fronto-central area 300–600 ms after the presentation of the stimulus.

According to the recorded results, two periods of significant interactions were identified: over the right occipital area at 160–200 ms after the presentation of the stimulus and over the left fronto-central area at 300–600 ms after the presentation of the stimulus.

1) Event-related potentials over the right occipital area 160–200 ms after presentation of the stimulus

In this latency window, the [AV − (A + V)] amplitudes were significant at several

right occipital electrodes (Figure 4.8). The AV response was stronger than the (A + V) response, so [AV − (A + V)] showed positive potentials at the right occipital sites from 160 to 200 ms (Figures 4.9, 4.10). The positive values in the [AV − (A + V)] differences at right occipital sites were therefore due to larger amplitudes (in absolute values) of the positive potentials in the AV than in the (A + V) condition.

2) Event-related potentials over the left fronto-central area 300–600 ms after presentation of the stimulus

In this latency range, the greatest differences between AV and (A + V) were found at fourteen electrodes covering a wide frontal and central scalp region between 300 and 600 ms (p < .01) (Figure 4.10). The topographies of the AV and (A + V) responses each displayed clear positive potentials over the left fronto-central area (Figure 4.9).

Figure 4.10 Statistical significance of the multisensory integration [AV − (A + V)] after the presentation of the stimulus at each of the 30 electrodes in the elderly adults.

4.4 Discussion

This study investigated the effects of age on audiovisual integration elicited by peripherally presented auditory and visual stimuli using event-related potentials. We found that these audiovisual integration effects in the elderly adults occur later than those in the

young adults.

Audiovisual integration in the young adults

Our results showed that prominent AV integration effects were found at later processing stages under divided attention.

A. Effects at central sites

A significant difference between AV and (A + V) was observed in the central area between 260 and 320 ms after stimulus onset (Figure 4.6). Several previous studies of audiovisual interactions have found analogous effects in the central area when both auditory and visual stimuli were task relevant [37,46]. Even more interesting is that the audiovisual interactions were very similar at around 280 ms in the central area, suggesting that the different experimental instructions had no, or only a very small, differential effect around 280 ms in this region. However, further studies are needed to confirm and elucidate these details. In addition, the wide potential field and larger potentials expressed in the [AV − (A + V)] differences between 260 and 320 ms could be due to audiovisual interactions in deep brain structures, since effects in this latency window are better evidenced on scalp potential maps. Some animal studies have found integration effects in the superior colliculus [76], and recent human experiments have also reported audiovisual interactions in the superior colliculus using functional magnetic resonance imaging (fMRI) or ERPs when subjects perceived audiovisual stimuli passively [77].

B. Effects at right posterior sites

In the present study, the negative [AV − (A + V)] amplitudes observed at central sites extended towards the right electrodes (F4, F8, CP2, CP6). At the same time, negative [AV − (A + V)] amplitudes at central sites extended towards the right posterior electrodes (Pz, P4, P8) (Figure 4.6C). However, no similar activation pattern was observed at the left frontal electrodes.
This result suggests that this audiovisual integration effect is likely to receive contributions from additional brain areas that are activated when the stimuli are simultaneously attended. Regardless, the results show clear effects of attention on audiovisual integration brain activity.

C. Audiovisual integration after 200 ms

The significant effects in [AV − (A + V)] were observed after 200 ms; such late interactions have not been reported in the literature. Talsma et al. reported that when

attention was directed to both modalities simultaneously, audiovisual interaction occurred during early sensory processing, whereas when only one modality was attended, the interaction process was delayed to a later processing stage [46]. The relatively late effect of the auditory cue on this visual response could be due either to feedback connections from audiovisual areas to the visual cortex or to projections from the auditory cortex to the visual cortex; such connections have been evidenced in the monkey [78,79].

Audiovisual integration in the elderly adults

In the elderly adults, the initial audiovisual integration effect was observed over a right occipital area in the scalp topography at approximately 160 to 200 ms after the presentation of the stimulus (Figure 4.9). This result resembles the finding of Talsma et al. (2006), who reported audiovisual integration over the occipital scalp N1 [80]. Mangun et al. also showed that the N1 is more explicitly related to enhanced processing of the attended location, including the spatial properties of the attended stimulus [81]. An additional audiovisual integration effect was found over the left fronto-central area at approximately 300 to 600 ms after the presentation of the stimulus (Figure 4.9). This effect has not previously been observed in studies of audiovisual integration. In elderly adults, cognitive abilities for vision and sound decline gradually with aging, and the delayed audiovisual integration in elderly adults might reflect changes in multisensory processing. However, further studies are needed to elucidate the neural mechanisms of this aging effect.

4.5 Conclusion

In conclusion, our study provides event-related potential evidence that the audiovisual integration elicited by peripherally presented auditory and visual stimuli is age-related.
We found that the audiovisual integration in elderly subjects was longer and more delayed compared to that of young subjects when the audiovisual stimuli were presented peripherally. 61
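The additive-model comparison used in these ERP analyses, integration indexed by the difference wave [AV - (A + V)], can be illustrated with a short, self-contained sketch. The waveforms, sampling rate, and component latencies below are synthetic values invented for demonstration; they are not the recorded data.

```python
# Minimal sketch of the additive-model ERP comparison: integration is
# indexed by [AV - (A + V)]. All waveforms are synthetic single-channel
# averages, invented for illustration only.
import numpy as np

fs = 500                                   # sampling rate (Hz), assumed
t = np.arange(-0.1, 0.6, 1 / fs)           # epoch: -100 to 600 ms

def erp(latency, amp):
    """Toy Gaussian ERP component (latency in s, amplitude in uV)."""
    return amp * np.exp(-((t - latency) ** 2) / (2 * 0.03 ** 2))

erp_a = erp(0.10, 4.0)                     # auditory-alone average
erp_v = erp(0.15, 3.0)                     # visual-alone average
erp_av = erp_a + erp_v + erp(0.20, 1.5)    # simultaneous AV: extra activity

interaction = erp_av - (erp_a + erp_v)     # [AV - (A + V)] difference wave
late = interaction[t >= 0.2]               # window after 200 ms
print(f"peak interaction after 200 ms: {late.max():.2f} uV")
```

In real data the "sum" ERP is computed from the unimodal averages of each subject, and the difference wave is tested pointwise against zero to locate the significant time windows and scalp areas reported above.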

Chapter 5 General Conclusion and Future Challenges

Summary

This dissertation has investigated the mechanisms of audiovisual integration in young and elderly adults through event-related potential experiments. In particular, we discussed the effects of the spatial characteristics of stimuli and of sound frequencies on audiovisual integration, as well as the effects of aging. The findings are summarized below. In addition, we describe challenges for future research.

5.1 General conclusions

The dissertation comprises three experiments. The first experiment investigated the effect of spatial auditory stimuli on human audiovisual integration in young adults. The second and third experiments investigated aging effects on human audiovisual integration using behavioral and event-related potential measures.

Chapter 2 describes the first experiment, which examined whether auditory stimuli in the horizontal plane, particularly those originating from behind the participant, affect audiovisual integration, using behavioral and event-related potential (ERP) measurements. A significant facilitation of reaction times and hit rates was obtained following audiovisual stimulation, irrespective of whether the auditory stimuli were presented in front of or behind the participant. However, no significant interactions were found between visual stimuli and auditory stimuli from the right or left. Two main ERP components related to audiovisual integration were found: first, auditory stimuli from the front location produced an ERP reaction over the right temporal area and right occipital area; second, auditory stimuli from the back produced a reaction over the parietal and occipital areas. Our results confirmed that audiovisual integration was elicited even when auditory stimuli were presented behind the participant, but no integration occurred when auditory stimuli were presented in the right or left spaces, suggesting that the human brain might be more sensitive to information received from behind than from either side.

Chapter 3 describes the second experiment, in which we used an independent race model to investigate aging effects on multisensory interactions induced by audiovisual stimuli presented peripherally in a discrimination task. This model analyzed the response times using cumulative distribution functions (CDFs). Audiovisual data were compared by evaluating the statistical facilitation against the CDF of the summed probabilities of the visual and auditory responses. Our results showed that the AV behavioral facilitation of young subjects was significant (p < 0.05), with response times ranging from 240 ms to 450 ms and peaking at 360 ms (14%). The AV behavioral facilitation of elderly subjects, however, was delayed and extended over a wider range, with response times ranging from 260 ms to 540 ms, a lower peak (12.6%), and a later peak time (390 ms). We found that the time window of AV behavioral facilitation in elderly subjects was longer but more delayed than in the young subjects when the AV stimuli were presented peripherally. This finding further confirmed that peripheral resolution decreases with age.

Chapter 4 describes the third experiment. In this experiment, we recorded event-related potentials (ERPs) during a divided-attention task in order to investigate the neural mechanisms of audiovisual integration in young and elderly adults. The stimulus was presented in the auditory (A), the visual (V), and the audiovisual (AV) modality. ERPs elicited by the auditory and visual stimuli presented alone were summed ("sum" ERP) and compared to the ERP elicited when they were presented simultaneously ("simultaneous" ERP). Behavioral results showed that reaction times (RTs) to the stimuli presented simultaneously were significantly faster than to the stimuli presented alone. For young adults, audiovisual integration comprised three phases of effects marked by scalp distribution: first, over the right fronto-central area; second, over the centro-medial area; third, over the right posterior area. For elderly adults, two periods of significant interactions were found: over the right occipital area at approximately 160 to 200 ms and over the left fronto-central area at approximately 300 to 600 ms after the presentation of the stimulus. We found that these interaction effects in the elderly adults occur later than those in the young adults.

5.2 Future challenges

Although the phenomenon of audiovisual integration is prevalent and important, its neural mechanism is not completely clear.
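The race-model comparison summarized above for Chapter 3 can be sketched in a few lines. The reaction-time samples below are synthetic values invented for illustration, not the thesis data; the test compares the audiovisual CDF against the summed unimodal CDFs, following Miller's race-model inequality.

```python
# Illustrative sketch of the independent race model applied to reaction
# times. The RT samples are synthetic, invented for demonstration only.
import numpy as np

rng = np.random.default_rng(0)
rt_a = rng.normal(420, 60, 1000)    # auditory-only RTs (ms), synthetic
rt_v = rng.normal(400, 60, 1000)    # visual-only RTs (ms), synthetic
rt_av = rng.normal(350, 55, 1000)   # audiovisual RTs (ms), synthetic

def cdf(samples, t):
    """Empirical cumulative probability P(RT <= t) at each probe time t."""
    return np.searchsorted(np.sort(samples), t, side="right") / len(samples)

t = np.arange(200, 601, 10)                            # probe times (ms)
bound = np.minimum(1.0, cdf(rt_a, t) + cdf(rt_v, t))   # race-model bound
facilitation = cdf(rt_av, t) - bound                   # > 0: bound violated

window = t[facilitation > 0]
if window.size:
    print(f"violation window: {window.min()}-{window.max()} ms, "
          f"peak {facilitation.max():.1%} at {t[facilitation.argmax()]} ms")
```

The time range where the facilitation curve stays positive corresponds to the behavioral facilitation window compared between young and elderly groups above.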
Based on previous studies and the present work, the main factors that affect audiovisual integration follow several simple rules: the spatial rule (audiovisual integration tends to be enhanced when the visual and auditory stimuli are presented at approximately the same location), the temporal rule (audiovisual integration is greater when the visual and auditory stimuli occur at approximately the same time), and stimulus effectiveness (early audiovisual integration appears when at least one of the visual and auditory stimuli is by itself only weakly effective). These factors have been investigated separately, but they have not been discussed systematically. In future studies, we aim to investigate the audiovisual integration process systematically and to clarify the mechanism of audiovisual integration. Another challenge is audiovisual integration in special populations. We live in

a multisensory world in which we are continuously flooded with information arriving through multiple sensory pathways. Thus, brain mechanisms might change with the living environment in older people and in patients. Therefore, it is important to study integration across sensory modalities. In future studies, we hope to identify the neural mechanisms of audiovisual integration and to provide an important basis for the early clinical detection and rehabilitation of brain diseases.

Appendix

Ⅰ. Simple Introduction of the EEG Apparatus

The BrainAmp MR plus is manufactured by Brain Products GmbH, Germany. This amplifier is a compact solution for neurophysiology research that can be combined with other units within the same product family to cover a vast range of possible application areas. This fully portable solution can be used for standard EEG/ERP recordings and can also be placed inside the MRI bore for simultaneous EEG/fMRI acquisitions. Thanks to its 5 kHz sampling rate per channel, the BrainAmp can be used to record EEG, EOG, and EMG signals, as well as evoked potentials with frequency content up to 1 kHz. The 16-bit TTL trigger input allows the detection of a large number of markers from visual, acoustic, electrical, magnetic, or other stimulation modalities. The BrainAmp can be used with both passive and active electrodes, offering a great degree of flexibility. The 32-channel units can be stacked to expand the number of channels up to 256 and can be combined with the BrainAmp ExG to record EEG, EOG, EMG, ECG, GSR (galvanic skin response), and many other types of bipolar and auxiliary signals.

Figure A1. EEG amplifier of the BrainAmp MR plus


More information

Seizure onset can be difficult to asses in scalp EEG. However, some tools can be used to increase the seizure onset activity over the EEG background:

Seizure onset can be difficult to asses in scalp EEG. However, some tools can be used to increase the seizure onset activity over the EEG background: This presentation was given during the Dianalund Summer School on EEG and Epilepsy, July 24, 2012. The main purpose of this introductory talk is to show the possibilities of improved seizure onset analysis

More information

When audition alters vision: an event-related potential study of the cross-modal interactions between faces and voices

When audition alters vision: an event-related potential study of the cross-modal interactions between faces and voices Neuroscience Letters xxx (2004) xxx xxx When audition alters vision: an event-related potential study of the cross-modal interactions between faces and voices F. Joassin a,, P. Maurage a, R. Bruyer a,

More information

Journal of Experimental Psychology: Human Perception and Performance

Journal of Experimental Psychology: Human Perception and Performance Journal of Experimental Psychology: Human Perception and Performance Item and Category-Based Attentional Control During Search for Real-World Objects: Can You Find the Pants Among the Pans? Rebecca Nako,

More information

Running head: HEARING-AIDS INDUCE PLASTICITY IN THE AUDITORY SYSTEM 1

Running head: HEARING-AIDS INDUCE PLASTICITY IN THE AUDITORY SYSTEM 1 Running head: HEARING-AIDS INDUCE PLASTICITY IN THE AUDITORY SYSTEM 1 Hearing-aids Induce Plasticity in the Auditory System: Perspectives From Three Research Designs and Personal Speculations About the

More information

ID# Final Exam PS325, Fall 1997

ID# Final Exam PS325, Fall 1997 ID# Final Exam PS325, Fall 1997 Good luck on this exam. Answer each question carefully and completely. Keep your eyes foveated on your own exam, as the Skidmore Honor Code is in effect (as always). Have

More information

Self-construal priming modulates visual activity underlying global/local perception

Self-construal priming modulates visual activity underlying global/local perception Biological Psychology 77 (2008) 93 97 Brief report Self-construal priming modulates visual activity underlying global/local perception Zhicheng Lin a,b, Yan Lin c, Shihui Han a,b, * a Department of Psychology,

More information

HST 583 fmri DATA ANALYSIS AND ACQUISITION

HST 583 fmri DATA ANALYSIS AND ACQUISITION HST 583 fmri DATA ANALYSIS AND ACQUISITION Neural Signal Processing for Functional Neuroimaging Neuroscience Statistics Research Laboratory Massachusetts General Hospital Harvard Medical School/MIT Division

More information

Event Related Potentials: Significant Lobe Areas and Wave Forms for Picture Visual Stimulus

Event Related Potentials: Significant Lobe Areas and Wave Forms for Picture Visual Stimulus Available Online at www.ijcsmc.com International Journal of Computer Science and Mobile Computing A Monthly Journal of Computer Science and Information Technology ISSN 2320 088X IMPACT FACTOR: 6.017 IJCSMC,

More information

Timing and Sequence of Brain Activity in Top-Down Control of Visual-Spatial Attention

Timing and Sequence of Brain Activity in Top-Down Control of Visual-Spatial Attention Timing and Sequence of Brain Activity in Top-Down Control of Visual-Spatial Attention Tineke Grent- t-jong 1,2, Marty G. Woldorff 1,3* PLoS BIOLOGY 1 Center for Cognitive Neuroscience, Duke University,

More information

Tilburg University. Published in: Journal of Cognitive Neuroscience. Publication date: Link to publication

Tilburg University. Published in: Journal of Cognitive Neuroscience. Publication date: Link to publication Tilburg University Time-course of early visual extrastriate activity in a blindsight patient using event related potentials. Abstract Pourtois, G.R.C.; de Gelder, Beatrice; Rossion, B.; Weiskrantz, L.

More information

Neural Strategies for Selective Attention Distinguish Fast-Action Video Game Players

Neural Strategies for Selective Attention Distinguish Fast-Action Video Game Players Neural Strategies for Selective Attention Distinguish Fast-Action Video Game Players - Online First - Springer 7/6/1 1:4 PM Neural Strategies for Selective Attention Distinguish Fast-Action Video Game

More information

Manuscript under review for Psychological Science. Direct Electrophysiological Measurement of Attentional Templates in Visual Working Memory

Manuscript under review for Psychological Science. Direct Electrophysiological Measurement of Attentional Templates in Visual Working Memory Direct Electrophysiological Measurement of Attentional Templates in Visual Working Memory Journal: Psychological Science Manuscript ID: PSCI-0-0.R Manuscript Type: Short report Date Submitted by the Author:

More information

Introduction to Electrophysiology

Introduction to Electrophysiology Introduction to Electrophysiology Dr. Kwangyeol Baek Martinos Center for Biomedical Imaging Massachusetts General Hospital Harvard Medical School 2018-05-31s Contents Principles in Electrophysiology Techniques

More information

Carnegie Mellon University Annual Progress Report: 2011 Formula Grant

Carnegie Mellon University Annual Progress Report: 2011 Formula Grant Carnegie Mellon University Annual Progress Report: 2011 Formula Grant Reporting Period January 1, 2012 June 30, 2012 Formula Grant Overview The Carnegie Mellon University received $943,032 in formula funds

More information

Material-speci c neural correlates of memory retrieval

Material-speci c neural correlates of memory retrieval BRAIN IMAGING Material-speci c neural correlates of memory retrieval Yee Y. Yick and Edward L. Wilding Cardi University Brain Research Imaging Centre, School of Psychology, Cardi University, Cardi, Wales,

More information

Attention modulates the processing of emotional expression triggered by foveal faces

Attention modulates the processing of emotional expression triggered by foveal faces Neuroscience Letters xxx (2005) xxx xxx Attention modulates the processing of emotional expression triggered by foveal faces Amanda Holmes a,, Monika Kiss b, Martin Eimer b a School of Human and Life Sciences,

More information

Simultaneous Acquisition of EEG and NIRS during Cognitive Tasks for an Open Access Dataset. Data Acquisition

Simultaneous Acquisition of EEG and NIRS during Cognitive Tasks for an Open Access Dataset. Data Acquisition Simultaneous Acquisition of EEG and NIRS during Cognitive Tasks for an Open Access Dataset We provide an open access multimodal brain-imaging dataset of simultaneous electroencephalography (EEG) and near-infrared

More information

Chapter 11: Sound, The Auditory System, and Pitch Perception

Chapter 11: Sound, The Auditory System, and Pitch Perception Chapter 11: Sound, The Auditory System, and Pitch Perception Overview of Questions What is it that makes sounds high pitched or low pitched? How do sound vibrations inside the ear lead to the perception

More information

Altered Dynamic of EEG Oscillations in Fibromyalgia Patients at Rest

Altered Dynamic of EEG Oscillations in Fibromyalgia Patients at Rest Altered Dynamic of EEG Oscillations in Fibromyalgia Patients at Rest Ana M. González-Roldán, PhD Ignacio Cifre, PhD Carolina Sitges, PhDPedro Montoya, PhD Pain Medicine, Volume 17, Issue 6, 1 June 2016,

More information

EEG reveals divergent paths for speech envelopes during selective attention

EEG reveals divergent paths for speech envelopes during selective attention EEG reveals divergent paths for speech envelopes during selective attention Cort Horton a, Michael D Zmura a, and Ramesh Srinivasan a,b a Dept. of Cognitive Sciences, University of California, Irvine,

More information

ERP Correlates of Identity Negative Priming

ERP Correlates of Identity Negative Priming ERP Correlates of Identity Negative Priming Jörg Behrendt 1,3 Henning Gibbons 4 Hecke Schrobsdorff 1,2 Matthias Ihrke 1,3 J. Michael Herrmann 1,2 Marcus Hasselhorn 1,3 1 Bernstein Center for Computational

More information

Brain and Cognition. Cognitive Neuroscience. If the brain were simple enough to understand, we would be too stupid to understand it

Brain and Cognition. Cognitive Neuroscience. If the brain were simple enough to understand, we would be too stupid to understand it Brain and Cognition Cognitive Neuroscience If the brain were simple enough to understand, we would be too stupid to understand it 1 The Chemical Synapse 2 Chemical Neurotransmission At rest, the synapse

More information

EEG History. Where and why is EEG used? 8/2/2010

EEG History. Where and why is EEG used? 8/2/2010 EEG History Hans Berger 1873-1941 Edgar Douglas Adrian, an English physician, was one of the first scientists to record a single nerve fiber potential Although Adrian is credited with the discovery of

More information

PD233: Design of Biomedical Devices and Systems

PD233: Design of Biomedical Devices and Systems PD233: Design of Biomedical Devices and Systems (Lecture-7 Biopotentials- 2) Dr. Manish Arora CPDM, IISc Course Website: http://cpdm.iisc.ac.in/utsaah/courses/ Electromyogram (EMG) Skeletal muscles are

More information

WAVELET ENERGY DISTRIBUTIONS OF P300 EVENT-RELATED POTENTIALS FOR WORKING MEMORY PERFORMANCE IN CHILDREN

WAVELET ENERGY DISTRIBUTIONS OF P300 EVENT-RELATED POTENTIALS FOR WORKING MEMORY PERFORMANCE IN CHILDREN WAVELET ENERGY DISTRIBUTIONS OF P300 EVENT-RELATED POTENTIALS FOR WORKING MEMORY PERFORMANCE IN CHILDREN Siti Zubaidah Mohd Tumari and Rubita Sudirman Department of Electronic and Computer Engineering,

More information

Title change detection system in the visu

Title change detection system in the visu Title Attention switching function of mem change detection system in the visu Author(s) Kimura, Motohiro; Katayama, Jun'ich Citation International Journal of Psychophys Issue Date 2008-02 DOI Doc URLhttp://hdl.handle.net/2115/33891

More information

NeuroImage 50 (2010) Contents lists available at ScienceDirect. NeuroImage. journal homepage:

NeuroImage 50 (2010) Contents lists available at ScienceDirect. NeuroImage. journal homepage: NeuroImage 50 (2010) 329 339 Contents lists available at ScienceDirect NeuroImage journal homepage: www.elsevier.com/locate/ynimg Switching associations between facial identity and emotional expression:

More information

Chapter 5. Summary and Conclusions! 131

Chapter 5. Summary and Conclusions! 131 ! Chapter 5 Summary and Conclusions! 131 Chapter 5!!!! Summary of the main findings The present thesis investigated the sensory representation of natural sounds in the human auditory cortex. Specifically,

More information