Low-Level Visual Processing Speed Modulates Judgment of Audio-Visual Simultaneity

Interdisciplinary Information Sciences Vol. 21, No. 2 (2015)
© Graduate School of Information Sciences, Tohoku University

Low-Level Visual Processing Speed Modulates Judgment of Audio-Visual Simultaneity

Yasuhiro TAKESHIMA*,† and Jiro GYOBA
Department of Psychology, Graduate School of Arts and Letters, Tohoku University, Sendai, Japan
Received December 5, 2014; final version accepted May 26, 2015

* Corresponding author: yasuhiro.takeshima@gmail.com
† Current address: Department of Psychology, Faculty of Human Studies, Bunkyo Gakuin University, Saitama, Japan

Temporal consistency between visual and auditory presentations is necessary for the integration of visual and auditory information, and subjective simultaneity matters more for this consistency than the physical synchrony of the inputs. Our previous studies have shown that audio-visual integration is difficult when visual processing is slow, even if the visual and auditory inputs are physically synchronous. In the present study, we examined the effects of visual processing speed on audio-visual integration using a simultaneity judgment task. Visual processing speed was manipulated by varying the spatial frequency of the visual stimuli; high spatial frequency stimuli require a longer processing time because visual responses to high spatial frequencies are slow. The results indicated that the difference between subjective and physical synchrony was larger for high than for low spatial frequency stimuli. Thus, the spatial frequency of the visual stimulus affected judgments of simultaneity between visual and auditory stimuli. These effects of visual processing speed on audio-visual integration appear to arise at a lower-order stage of sensory processing.

KEYWORDS: processing speed, simultaneity judgment task, Gabor patch

1. Introduction

When perception in one sensory modality is ambiguous, information from other sensory modalities reduces this ambiguity [1]. Humans therefore perceive a stable outer environment via multisensory inputs. Multisensory integration has been studied extensively; in particular, many studies have examined the process of integrating visual and auditory information [e.g., 2-4]. However, audio-visual integration does not always occur. For example, temporal consistency between visual and auditory information is necessary. In single-cell recording studies, neuronal responses in the superior colliculus (SC) are enhanced by temporal consistency between visual and auditory inputs [5]. Moreover, human behavioral data show that sensitivity to a visual target can be facilitated by temporally consistent auditory input [6], and in humans SC activity is related to audio-visual integration [7]. Therefore, temporal consistency is important for audio-visual integration. Nevertheless, a range of asynchronies between the temporal onsets of visual and auditory stimuli is tolerated and still perceived as synchronous [8], from -130 ms (sound preceding) to +250 ms (sound following) [9]. Several crossmodal illusions induced by sound (i.e., audio-visual interactions) have temporal windows within this range [e.g., 10-14]. Subjective synchrony therefore seems to be more important for audio-visual integration than physical synchrony.

In a previous study, we hypothesized that the timing of audio-visual onsets would be judged as more asynchronous when visual processing was slower [15]. Takeshima and Gyoba [15] investigated the effects of processing speed on the fission illusion, an audio-visual illusion, by manipulating the complexity of the visual stimuli.
In the fission illusion, when a single flash is presented simultaneously with two beeps, two flashes are perceived [11, 12]. The results indicated that the fission illusion was difficult to induce with complex visual stimuli, suggesting that audio-visual integration occurs with more difficulty for complex visual stimuli, for which processing speed is slower. Our hypothesis, that slow processing of sensory stimuli affects the simultaneous perception of auditory and visual information, was also supported in a study investigating the effects of object quantity [16] in addition to complexity [15].

In the present study, we re-examined the effects of processing speed on audio-visual integration with respect to the following two points. First, audio-visual integration has been suggested to occur at multiple stages of sensory processing. In various anatomical and physiological studies, multisensory information converges not only in higher-order association cortices, but also in early sensory areas (i.e., primary visual, auditory, or somatosensory areas) [17-19, for a review]. Moreover, Mishra et al. [20] measured event-related potentials (ERPs) during the fission illusion and observed ERP components related to the illusion at various latencies. Sanabria et al. [21] examined the effects of audio-visual integration on motion perception using signal detection theory.

In signal detection theory, two indices are calculated: d-prime and the criterion [22]. The d-prime reflects effects at the perceptual level, and the criterion reflects effects at the cognitive level. The results indicated that both d-prime and criterion scores were changed by audio-visual integration in motion perception. Therefore, effects of processing speed should be observed at both higher-order (i.e., complexity of visual stimuli) and lower-order levels of processing. In this study we manipulated the spatial frequency of the visual stimuli. It is well known that spatial frequency information is processed in the early visual cortex (V1) and that responses to high spatial frequency visual stimuli are slow [23]. Second, we used a simultaneity judgment (SJ) task in the present study, whereas in our previous studies the effects of processing speed were measured by the strength of the fission illusion or by performance on a same-different task [15, 16]. If visual processing speed influences subjective temporal synchrony perception, simultaneity judgments for visual and auditory stimuli should also be altered.

2. Method

2.1 Participants

Nine observers (four women and five men, mean age = 23.9 ± 1.36 years) participated in this experiment. All participants reported normal or corrected-to-normal vision and normal audition. None were informed of the purpose of the experiment except for one participant (the first author).

2.2 Apparatus

Stimuli were generated and controlled by a custom-made program written in MATLAB (MathWorks, Inc.) with the Cogent Graphics and Cogent 2000 toolboxes, running on a PC (XPS720, Dell; OS: Windows Vista, Microsoft). The visual stimuli were displayed on a CRT display (Trinitron GDM-F520, Sony; refresh rate: 60 Hz). The auditory stimuli were delivered through an audio interface (Edirol FA-66, Roland) and headphones (HDA200, Sennheiser). Simultaneity of the visual and auditory stimuli was confirmed using a digital oscilloscope (TS-80600, Iwatsu). The experiment was conducted in a dark room with 43.6 dB(A) of background noise. Participants viewed the monitor binocularly at a distance of 60 cm with their heads stabilized on a chin rest.

2.3 Stimuli

Each trial consisted of the presentation of a red (15.3 cd/m²) double circle (1.1 deg in diameter; used for fixation) and the visual stimuli. All stimuli were presented on a gray (17.9 cd/m²) background. The visual stimuli were Gabor patches (Figure 1a) of two spatial frequencies, 1.0 and 5.0 cycles per degree (c/deg). They were 2.0 deg in diameter and were presented for 17 ms. The auditory stimulus was a 2000 Hz pure tone presented for 17 ms (including 1.7 ms ramps at the beginning and end of the sound envelope) at a sound pressure level of 75 dB(A). There were nine stimulus onset asynchronies (SOAs) between the visual and auditory stimuli: -350, -233, -150, -67, 0, +67, +150, +233, and +350 ms (negative SOAs indicate that the auditory stimulus was presented before the visual stimulus, and vice versa).
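As an illustration of the stimulus manipulation (not the authors' code, which was written in MATLAB with the Cogent toolboxes), the following minimal Python sketch shows how Gabor patches of the two spatial frequencies could be generated. The pixels-per-degree value, contrast, and Gaussian envelope width are assumptions; the paper specifies only the spatial frequencies, the 2.0 deg diameter, the 17 ms duration, and the gray background.

```python
import numpy as np

def gabor_patch(spatial_freq_cpd, size_deg=2.0, pix_per_deg=40,
                contrast=0.8, envelope_sd_deg=0.4):
    """Return a Gabor patch as a 2-D array of values in roughly [-1, 1].

    spatial_freq_cpd: carrier frequency in cycles per degree (1.0 or 5.0 here).
    size_deg: patch diameter in degrees of visual angle (2.0 deg in the paper).
    pix_per_deg, contrast, envelope_sd_deg: assumed values, not reported in the paper.
    """
    n = int(round(size_deg * pix_per_deg))
    half = size_deg / 2.0
    coords = np.linspace(-half, half, n)                    # coordinates in degrees
    xx, yy = np.meshgrid(coords, coords)
    carrier = np.cos(2 * np.pi * spatial_freq_cpd * xx)     # vertical sinusoidal grating
    envelope = np.exp(-(xx ** 2 + yy ** 2) / (2 * envelope_sd_deg ** 2))  # Gaussian window
    return contrast * carrier * envelope

low_sf = gabor_patch(1.0)    # 1 c/deg condition
high_sf = gabor_patch(5.0)   # 5 c/deg condition
```

A 1.0 c/deg patch contains about two carrier cycles across its 2.0 deg extent, whereas a 5.0 c/deg patch contains about ten; this difference in carrier frequency is the only property the manipulation relies on.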
2.4 Procedure

A trial schematic is shown in Figure 1b. Trials were initiated by pressing the "0" key. Each trial consisted of a 500 ms fixation display followed by blank and target displays. The duration of the blank display was randomized (500-1000 ms). Each participant completed 640 trials; there were more synchronous (N = 320) than asynchronous (N = 40 for each SOA) trials, so that the a priori numbers of synchronous and asynchronous trials were approximately equal (for a similar methodology, see [24-26]). In the SJ task, participants were instructed to press "1" for simultaneous and "3" for asynchronous presentations.

After the SJ task, each participant performed a response time (RT) task for each spatial frequency of Gabor patch. The trial sequence is shown in Figure 1c. A fixation cross was presented at the center of the screen (500 ms), followed by a blank display (randomized duration, 500-1000 ms). The visual stimulus was then presented until a response was made. Participants were instructed to press the "5" key as soon as the visual stimulus appeared. There were 60 RT trials (N = 30 for each spatial frequency).

3. Results

To compute the point of subjective simultaneity (PSS), each participant's data were fitted with the following four-parameter Gaussian function (see also [25, 26]) by minimizing the root-mean-square error (RMSE) in Microsoft Excel Solver:

P(response | SOA) = blink rate + a · exp[-0.5 · ((SOA - PSS) / b)²]

The SOA term took the values used in the experiment (-350 to +350 ms); the parameters a, b, PSS, and blink rate were estimated from the data. The blink rate parameter was included to account for a small proportion of noisy trials in which participants were not attending to the actual stimuli (e.g., due to eye blinks or other artifacts), which would otherwise lead to an overestimation of the other parameters [27]. The same analysis methodology, including this parameter, was also used in a previous study [25].
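The fitting itself was carried out in Microsoft Excel Solver; the sketch below reproduces the same idea in Python purely as an illustration, minimizing the RMSE of the four-parameter Gaussian. The parameter bounds follow the constraints reported in the Results (blink rate between 0% and 2.5%, other parameters non-negative); the starting values and the example response proportions are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

SOAS = np.array([-350, -233, -150, -67, 0, 67, 150, 233, 350], dtype=float)

def gaussian_psychometric(params, soa):
    """P(simultaneous | SOA) = blink_rate + a * exp(-0.5 * ((SOA - PSS) / b)**2)."""
    blink_rate, a, pss, b = params
    return blink_rate + a * np.exp(-0.5 * ((soa - pss) / b) ** 2)

def fit_pss(p_simultaneous, soas=SOAS):
    """Estimate the four parameters by minimizing the RMSE, as described above."""
    def rmse(params):
        return np.sqrt(np.mean((gaussian_psychometric(params, soas) - p_simultaneous) ** 2))

    start = [0.01, 0.8, 50.0, 150.0]   # blink_rate, a, PSS, b (starting guesses; assumed)
    bounds = [(0.0, 0.025),            # blink rate constrained to 0-2.5%
              (0.0, None),             # a >= 0
              (0.0, None),             # PSS >= 0
              (1e-6, None)]            # b > 0 (avoids division by zero)
    result = minimize(rmse, start, bounds=bounds, method="L-BFGS-B")
    blink_rate, a, pss, b = result.x
    return pss, result.x, rmse(result.x)

# Hypothetical proportions of "simultaneous" responses at each SOA for one participant.
example = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.97, 0.85, 0.55, 0.20])
pss, fitted_params, fit_error = fit_pss(example)
```

Because the fitted function peaks at SOA = PSS, the PSS reported below is simply this fitted parameter for each participant.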

Fig. 1. (a) Low spatial frequency (1 c/deg; left) and high spatial frequency (5 c/deg; right) Gabor patches used in the experiment. (b) Schematic representation of the simultaneity judgment task (fixation, 500 ms; blank, 500-1000 ms; target, 17 ms). (c) Schematic representation of the response time task (fixation, 500 ms; blank, 500-1000 ms; target, until response).

The blink rate was restricted to a minimum of 0% and a maximum of 2.5%, and its mean estimate was 1.8% in this experiment (1 c/deg: 1.5%; 5 c/deg: 2.1%). The other parameters were restricted to a minimum of 0. The results are shown in Figure 2a, which presents the mean percentage of simultaneity responses as a function of spatial frequency and SOA, together with the fitted psychometric functions (1 c/deg: mean RMSE = 0.10, SD = 0.06; 5 c/deg: mean RMSE = 0.10, SD = 0.05). A two-way analysis of variance (ANOVA) with spatial frequency and SOA as within-participants factors revealed a significant main effect of SOA (F(8, 64) = 122.87, p < .001, ηp² = .93), indicating that participants complied with the SJ task. The interaction between spatial frequency and SOA was also significant (F(8, 64) = 3.84, p < .001, ηp² = .33). The simple main effects of spatial frequency were significant at the +233 and +350 ms SOAs (+233 ms: F(1, 72) = 7.39, p < .05, ηp² = .09; +350 ms: F(1, 72) = 5.94, p < .05, ηp² = .08), indicating that the percentage of simultaneity responses was higher for 5 c/deg than for 1 c/deg Gabor patches at these SOAs. The main effect of spatial frequency was not significant (F(1, 8) = 0.35, p = .57, ηp² = .04).

PSS scores were computed as the SOA corresponding to the peak of each individual's fitted psychometric function. A two-tailed paired t-test comparing the PSS values between the spatial frequency conditions showed that the PSS was larger in the 5 c/deg than in the 1 c/deg condition (t(8) = 2.82, p < .05, d = .23); mean PSS values and the 95% confidence intervals for each condition are shown in Figure 2b. A two-tailed paired t-test on the RT data confirmed a difference in processing speed between the spatial frequency conditions: RT was slower in the 5 c/deg than in the 1 c/deg condition (t(8) = 2.82, p < .05, d = .30). Mean RTs are shown in Figure 2c.

Fig. 2. Experimental results. (a) Mean percentage of simultaneous responses as a function of SOA (from auditory precedence to visual precedence), with the average fitted functions plotted for the 1 c/deg and 5 c/deg Gabor patch data. (b) Mean point of subjective simultaneity (PSS, ms); error bars show the 95% confidence interval (n = 9). (c) Mean response time (ms) for each spatial frequency; error bars show standard errors of the mean.
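For completeness, the following sketch shows one way to carry out the paired comparisons just reported (PSS and RT between the 1 c/deg and 5 c/deg conditions). The per-participant values are hypothetical, and the Cohen's d convention is an assumption, since the paper does not state which effect-size formula was used.

```python
import numpy as np
from scipy import stats

def compare_paired(cond_low, cond_high):
    """Two-tailed paired t-test between the 1 c/deg and 5 c/deg conditions,
    plus an effect size and 95% confidence intervals of the condition means."""
    low = np.asarray(cond_low, dtype=float)
    high = np.asarray(cond_high, dtype=float)
    t, p = stats.ttest_rel(high, low)
    # Cohen's d based on the averaged condition SDs (one common convention;
    # the paper does not specify its formula).
    d = (high.mean() - low.mean()) / np.sqrt(
        (low.std(ddof=1) ** 2 + high.std(ddof=1) ** 2) / 2)

    def ci95(x):
        half = stats.sem(x) * stats.t.ppf(0.975, len(x) - 1)
        return x.mean() - half, x.mean() + half

    return {"t": t, "p": p, "d": d, "ci_low": ci95(low), "ci_high": ci95(high)}

# Hypothetical PSS values (ms) for nine participants in each condition.
pss_1cpd = [80, 95, 110, 70, 120, 90, 105, 85, 100]
pss_5cpd = [100, 110, 130, 95, 140, 105, 125, 100, 115]
result = compare_paired(pss_1cpd, pss_5cpd)
```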

4. Discussion

In the present study, we examined the effects of spatial frequency on the synchrony perception of visual and auditory stimuli. The results indicated a larger difference between the PSS values and 0 ms for high spatial frequency (5 c/deg) than for low spatial frequency (1 c/deg) stimuli in the SJ task. Moreover, the difference in processing speed between visual stimuli of different spatial frequencies was confirmed by measuring RT. If subjective synchrony perception were determined solely by physical synchrony between visual and auditory inputs, the PSS value would be 0 ms. In general, PSS values are shifted in the positive direction, indicating that simultaneity is perceived maximally when the visual input slightly precedes the auditory input (e.g., [7, 25, 26, 28, 29]). For example, Zampini et al. [25] showed that the PSS shifts to within about 30 ms, and Fujisaki and Nishida [28] reported a synchrony-asynchrony discrimination threshold (an index close to the PSS) of approximately 70 ms. This PSS shift reflects the difference in neural latency between visual and auditory neurons [30]; neural latency is longer for visual than for auditory neurons [31]. In the present experiment, by contrast, the PSS shifted by approximately 100 ms for both the high and the low spatial frequency stimuli. This shift is larger than in previous studies (e.g., [7, 25, 26, 28, 29]) and may be attributable to the blurred edges of the Gabor patches. More importantly, the PSS values differed significantly between the high and low spatial frequency stimuli. Therefore, the present study clearly indicates that the temporal synchrony perception of visual and auditory stimuli is affected by the spatial frequency of the visual stimulus.

In the present study, RT was longer for high than for low spatial frequency Gabor patches. This is consistent with previous studies showing that high spatial frequency stimuli are preferentially processed by sustained channels, whereas low spatial frequency stimuli are preferentially processed by transient channels [e.g., 32-34]. Because conduction is slower in the neural fibers of sustained than of transient channels [35], visual processing is slower for high than for low spatial frequency stimuli [19]. The present results therefore also suggest that slow visual processing affects the integration of visual and auditory information [15, 16].

The present data showed that sensory processing speed affected simultaneity judgments only in the visual precedence conditions (i.e., +233 and +350 ms SOAs). This asymmetry may reflect attunement to natural situations: in a natural scene, a visual stimulus (light) always precedes the corresponding auditory stimulus (sound), whereas the reverse does not occur. Thus, simultaneity judgments would be more sensitive to sensory processing speed in visual precedence than in auditory precedence conditions.

It is well known that spatial frequency information is processed in the early visual cortex (V1). A previous study has shown that activation of the left temporo-parietal junction (TPJ) is modulated by the audio-visual synchrony percept [36].

Furthermore, the connectivity among the left inferior parietal lobe (IPL), the right dorsolateral prefrontal cortex (DLPFC), and the right TPJ is considered to be related to the timing percept [36]. Spatial frequency information may be conveyed to these brain areas and used in discriminating the audio-visual synchrony percept; alternatively, the difference in conduction speed between visual and auditory signals arriving at the TPJ may be used for this discrimination. The activity of these brain areas would adjust the audio-visual timing percept according to various factors (e.g., the distance from the information source [37] and the difference in response latency [31]). The difference in response time would therefore not be reflected directly in the difference in PSS between low and high spatial frequency stimuli, because these adjustments are made during the audio-visual simultaneity judgment process.

In our previous studies, audio-visual interaction was affected by visual processing speed as manipulated by stimulus complexity [15] and object quantity [16]. The complexity of visual stimuli is processed in the anterior inferotemporal cortex, posterior inferotemporal cortex, and V4 [38]. Moreover, the effects of object quantity likely occur during encoding [16], which is related to activity in the intraparietal sulcus [39, 40]. In contrast, it is well known that spatial frequency information is processed in the early visual cortex (V1). Admittedly, it remains possible that a difference in contrast affected the audio-visual synchrony judgment, because contrast was not fully controlled between the low and high spatial frequency stimuli. However, contrast information is also processed in early visual areas. In other words, both spatial frequency and contrast processing of visual stimuli precede the processing of visual complexity. Therefore, the present study indicates that the effects of processing speed on audio-visual interaction occur at lower-order levels of processing (i.e., those responsible for spatial frequency information) in addition to higher-order levels (i.e., those responsible for stimulus complexity).

Acknowledgments

We are grateful to the participants for their time and contribution. This research was supported by the Japan Society for the Promotion of Science (JSPS) KAKENHI (Grant-in-Aid for JSPS Fellows).

REFERENCES

[1] Sumby, W. H., and Pollack, I., Visual contribution to speech intelligibility in noise, J. Acoust. Soc. Am., 26 (1954).
[2] Koelewijn, T., Bronkhorst, A., and Theeuwes, J., Attention and the multiple stages of multisensory integration: A review of audiovisual studies, Acta Psychol., 134 (2010).
[3] Spence, C., Audiovisual multisensory integration, Acoust. Sci. Technol., 28 (2007).
[4] Talsma, D., Senkowski, D., Soto-Faraco, S., and Woldorff, M. G., The multifaceted interplay between attention and multisensory integration, Trends Cogn. Sci., 14 (2010).
[5] Meredith, M. A., Nemitz, J. W., and Stein, B. E., Determinants of multisensory integration in superior colliculus neurons. I. Temporal factors, J. Neurosci., 7 (1987).
[6] Bolognini, N., Frassinetti, F., Serino, A., and Ladavas, E., Acoustical vision of below threshold stimuli: Interaction among spatially converging audiovisual inputs, Exp. Brain Res., 160 (2005).
[7] Lewald, J., and Guski, R., Cross-modal perceptual integration of spatially and temporally disparate auditory and visual stimuli, Cognit. Brain Res., 16 (2003).
[8] Guski, R., and Troje, N. F., Audiovisual phenomenal causality, Percept. Psychophys., 65 (2003).
[9] Fairhall, S. L., and Macaluso, E., Spatial attention can modulate audiovisual integration at multiple cortical and subcortical sites, Eur. J. Neurosci., 29 (2009).
[10] Hidaka, S., Manaka, Y., Teramoto, W., Sugita, Y., Miyauchi, R., Gyoba, J., Suzuki, Y., and Iwaya, Y., Alternation of sound location induces visual motion perception of a static object, PLOS ONE, 4(12): e8188 (2009).
[11] Remijn, G. B., Ito, H., and Nakajima, Y., Audiovisual integration: An investigation of the streaming-bouncing phenomenon, J. Physiol. Anthropol. Appl. Hum. Sci., 23 (2004).
[12] Shams, L., Kamitani, Y., and Shimojo, S., What you see is what you hear, Nature, 408: 788 (2000).
[13] Shams, L., Kamitani, Y., and Shimojo, S., Visual illusion induced by sound, Cognit. Brain Res., 14 (2002).
[14] Takeshima, Y., and Gyoba, J., High-intensity sound increases size of visually perceived objects, Atten. Percept. Psychophys., 75 (2013a).
[15] Takeshima, Y., and Gyoba, J., Complexity of visual stimuli affects visual illusion induced by sound, Vis. Res., 91: 1-7 (2013b).
[16] Takeshima, Y., and Gyoba, J., Pattern dot quantity affects auditory facilitation effects on visual object representations, Perception, 43 (2014).
[17] Calvert, G., and Thesen, T., Multisensory integration: Methodological approaches and emerging principles in the human brain, J. Physiol. (Paris), 98 (2004).
[18] Ghazanfar, A. A., and Schroeder, C. E., Is neocortex essentially multisensory? Trends Cogn. Sci., 10 (2006).
[19] Schroeder, C. E., and Foxe, J., Multisensory contributions to low-level, unisensory processing, Curr. Opin. Neurobiol., 15 (2005).
[20] Mishra, J., Martinez, A., Sejnowski, T., and Hillyard, S. A., Early cross-modal interactions in auditory and visual cortex underlie a sound-induced visual illusion, J. Neurosci., 27 (2007).
[21] Sanabria, D., Spence, C., and Soto-Faraco, S., Perceptual and decisional contributions to audiovisual interactions in the perception of apparent motion: A signal detection study, Cognition, 102 (2007).
[22] Macmillan, N. A., and Creelman, C. D., Detection Theory: A User's Guide (2nd ed.), Psychology Press (2005).

[23] Breitmeyer, B. G., Simple reaction time as a measure of the temporal response properties of transient and sustained channels, Vis. Res., 15 (1975).
[24] Santangelo, V., and Spence, C., Crossmodal attentional capture in an unspeeded simultaneity judgment task, Vis. Cogn., 16 (2008).
[25] Van der Burg, E., Olivers, C. N. L., Bronkhorst, A. W., and Theeuwes, J., Audiovisual events capture attention: Evidence from temporal order judgments, J. Vis., 8(5): 1-10 (2008).
[26] Zampini, M., Guest, S., Shore, D. I., and Spence, C., Audio-visual simultaneity judgments, Percept. Psychophys., 67 (2005).
[27] Swanson, W. H., and Birch, E. E., Extracting thresholds from noisy psychophysical data, Percept. Psychophys., 51 (1992).
[28] Fujisaki, W., and Nishida, S., Temporal frequency characteristics of synchrony-asynchrony discrimination of audio-visual signals, Exp. Brain Res., 166 (2005).
[29] Kayser, C., Petkov, C., and Logothetis, N., Visual modulation of neurons in auditory cortex, Cereb. Cortex, 18 (2008).
[30] Vroomen, J., and Keetels, M., Perception of intersensory synchrony: A tutorial review, Atten. Percept. Psychophys., 72 (2010).
[31] King, A. J., Multisensory integration: Strategies for synchronization, Curr. Biol., 15: R339-R341 (2010).
[32] Breitmeyer, B., and Julesz, B., The role of on and off transients in determining the psychophysical spatial frequency response, Vis. Res., 15 (1975).
[33] Kulikowski, J. J., and Tolhurst, D. J., Psychophysical evidence for sustained and transient detectors in human vision, J. Physiol. (Lond.), 232 (1973).
[34] Tolhurst, D. J., Separate channels for the analysis of the shape and movement of a moving visual stimulus, J. Physiol. (Lond.), 231 (1973).
[35] Hoffmann, K.-P., Conduction velocity in pathways from retina to superior colliculus in the cat: A correlation with receptive field properties, J. Neurophysiol., 36 (1973).
[36] Adhikari, B. M., Goshorn, E. S., Lamichhane, B., and Dhamala, M., Temporal-order judgment of audiovisual events involves network activity between parietal and prefrontal cortices, Brain Connectivity, 3 (2013).
[37] Sugita, Y., and Suzuki, Y., Implicit estimation of sound-arrival time, Nature, 421 (2003).
[38] Kobatake, E., and Tanaka, K., Neuronal selectivities to complex object features in the ventral visual pathway of the macaque cerebral cortex, J. Neurophysiol., 71 (1994).
[39] Todd, J. J., and Marois, R., Capacity limit of visual short-term memory in human posterior parietal cortex, Nature, 428 (2004).
[40] Todd, J. J., and Marois, R., Posterior parietal cortex activity predicts individual differences in visual short-term memory capacity, Cognit. Affect. Behav. Neurosci., 5 (2005).
