Dynamic Temporal Processing of Multisensory Information


Dynamic Temporal Processing of Multisensory Information

Dr. Zhuanghua Shi

Habilitation thesis submitted to the Faculty of Psychology and Educational Sciences (Fakultät für Psychologie und Pädagogik), Ludwig-Maximilians-Universität München

Submitted by Dr. Zhuanghua Shi
Munich, 15 January 2013


To my family...


Contents

1 Synopsis
  1.1 Introduction
    1.1.1 Multisensory spatial integration
    1.1.2 Multisensory temporal integration
    1.1.3 Multisensory time perception
    1.1.4 Sensorimotor recalibration and delay perception
    1.1.5 Multisensory enhancement and perceptual learning in visual search
    1.1.6 Multimodal feedback delay in human-machine interaction (HMI)
  1.2 Cumulative research work
    1.2.1 Multisensory temporal integration and motion perception
    1.2.2 Multisensory and sensorimotor time perception
    1.2.3 Multisensory enhancement, context learning and search performance
    1.2.4 Delays in multimodal feedback and user experience
  1.3 Summary and outlook
  References

2 Scientific Publications (Wissenschaftliche Veröffentlichungen)
  2.1 List of publications ( )
  2.2 Part I: Multimodal temporal integration and motion perception
    - Audiovisual Ternus apparent motion
    - Perceptual grouping and crossmodal apparent motion
    - Auditory capture of tactile apparent motion
  2.3 Part II: Multimodal time perception
    - Auditory reproduction
    - Feedback delay and duration reproduction
    - Emotional modulation of tactile duration
    - Emotional modulation of audiotactile TOJ
    - Simultaneity in schizophrenia patients
  2.4 Part III: Multimodal enhancement, perceptual learning and search performance
    - Eye movements and the pip-and-pop effect
    - Contextual cueing in multiconjunction search
    - Transfer of contextual cueing in full-icon display remapping

  2.5 Part IV: Delays in multimodal processing and user experience
    - Neural latencies and motion extrapolation in the central fovea
    - Delay in haptic telepresence systems
    - Effects of packet loss and latency on visual-haptic TOJs
    - Temporal perception of visual-haptic events
    - Delay perception in different haptic environments
    - Optimization for haptic delayed telepresence systems

Acknowledgements
Curriculum Vitae (Lebenslauf)

1 Synopsis

1.1 Introduction

1.1.1 Multisensory spatial integration

Signals from the natural environment are highly redundant, since we perceive the external world via multiple senses. For example, when knocking on a door we not only hear a sound and see a hand movement, but also perceive a touch from the knocking hand. The multisensory nature of the world is highly advantageous: it increases perceptual reliability and saliency and, as a result, enhances object discrimination and identification and facilitates reactions to the external world (Vroomen & Keetels, 2010). However, the multisensory nature of the world also raises complex integration and segregation problems. For instance, how does our brain sort through relevant and irrelevant signals to form a coherent multisensory percept? Imagine you are chatting with friends in a coffee bar: you hear multiple voices and see lip movements simultaneously. You usually identify and combine this information correctly without any difficulty, but sometimes you may fail to integrate, or may mis-combine, a face and a voice. Something similar happens when you watch a movie in a cinema: you believe the voices are coming from the actors' lips, although they are delivered by the loudspeakers on the sidewalls. This is known as the ventriloquist effect. Such an audiovisual speech illusion, however, is only one example of multisensory integration, and there are many others. To take another example, when a single flash is accompanied by two beeps, the single flash is often perceived as two flashes (Shams, Kamitani, & Shimojo, 2000).

Over the past few decades, much knowledge has been gained about multisensory perception, particularly about spatial integration. The most common account of multisensory integration is the modality appropriateness, or modality precision, hypothesis (Welch & Warren, 1980), which states that the sensory modality with the highest acuity outweighs the others in multisensory integration. For example, vision, with its high spatial resolution, may dominate over audition in spatial perception, which explains why the position of an auditory stimulus is often captured by a simultaneous visual stimulus. In recent years, probabilistic models, such as maximum likelihood estimation (MLE) (Alais & Burr, 2004; Ernst & Banks, 2002; Ernst & Bülthoff, 2004), have been developed to provide quantitative accounts of multisensory integration. The MLE model assumes that different sensory inputs are assigned differential weights, with each weight set in proportion to the reliability of the corresponding sensory estimate. Using this approach, the final multisensory estimate has minimal variance (in other words, maximal reliability).

Consider a bimodal source (e.g., an audiovisual signal) that produces two sensory cues (e.g., positional cues), estimated by the auditory and visual systems as (Ŝ_a, Ŝ_v). The MLE model predicts that the final audiovisual estimate Ŝ is

Ŝ = w_a Ŝ_a + w_v Ŝ_v,   (1.1)

where w_a = (1/σ_a²) / (1/σ_a² + 1/σ_v²), w_v = 1 − w_a, and σ_a², σ_v² are the variances of the auditory and visual sensory estimates (see Figure 1.1).

Figure 1.1: A normative MLE model of audiovisual integration. The audiovisual estimate Ŝ is the linear combination of Ŝ_a and Ŝ_v, with each weight set in proportion to its reliability.

Reliability-based integration models have successfully predicted multisensory integration in many situations, such as visual-haptic size estimation and audiovisual localization (for a review, see Alais, Newell, & Mamassian, 2010). The MLE model has recently been extended to a more general Bayesian framework, in which prior knowledge about the multisensory information is incorporated (Ernst & Di Luca, 2011; Körding & Wolpert, 2004; Roach, Heron, & McGraw, 2006). Using priors allows Bayesian models to predict both multisensory integration and multisensory segregation (Ernst & Di Luca, 2011; Roach et al., 2006). Note that the quantitative models mentioned above were derived from studies of multisensory spatial integration (Alais & Burr, 2004; Ernst & Banks, 2002); the evidence is rather mixed as to whether they also apply to multisensory temporal integration (see Section 1.1.2).
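To make the weighting rule concrete, here is a minimal Python sketch of Equation 1.1. The function and all numerical values are illustrative assumptions, not code or data from the studies cited above.

```python
def mle_integrate(s_a, sigma_a, s_v, sigma_v):
    """Reliability-weighted (MLE) fusion of an auditory and a visual
    estimate (Eq. 1.1): each weight is inversely proportional to the
    variance of the corresponding unimodal estimate."""
    r_a, r_v = 1.0 / sigma_a**2, 1.0 / sigma_v**2  # reliabilities
    w_a = r_a / (r_a + r_v)
    s_hat = w_a * s_a + (1.0 - w_a) * s_v          # fused estimate
    var_hat = 1.0 / (r_a + r_v)                    # never exceeds either variance
    return s_hat, var_hat

# Vision is the more reliable spatial cue here, so it dominates the fused
# position: the spatial ventriloquist effect in miniature.
s_hat, var_hat = mle_integrate(s_a=3.0, sigma_a=4.0, s_v=0.0, sigma_v=1.0)
print(f"fused estimate = {s_hat:.2f}, variance = {var_hat:.2f}")
```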

1.1.2 Multisensory temporal integration

A precondition for multisensory integration is the assumption of unity, which holds that multisensory integration makes sense only when the perceptual system has evidence that the multiple signals (events) originate from a common source (Welch, 1999; Welch & Warren, 1980). Without doubt, the most important factors for perceiving a common source are spatial proximity and temporal coincidence: multisensory integration occurs when the sensory signals originate from proximal locations and reach the brain at around the same time. Otherwise, the sensory signals are likely to be perceived as separate events.

Similar to multisensory spatial integration, multisensory events with temporal coincidence may interact and integrate, forming a coherent temporal percept. Recent studies of audiovisual interaction in temporal order judgments (Morein-Zamir, Soto-Faraco, & Kingstone, 2003; Scheier, Nijhawan, & Shimojo, 1999) found that the temporal discrimination threshold for visual events can be altered by adding two auditory clicks: when the first click occurs slightly before the first flash and the second click shortly after the second flash, visual temporal resolution is enhanced, as if the two clicks pull the two flashes apart. This has been termed the temporal ventriloquist effect, by analogy with the spatial ventriloquist effect. Various forms of the temporal ventriloquist effect have since been found with different paradigms (Fendrich & Corballis, 2001; Getzmann, 2007; Keetels, Stekelenburg, & Vroomen, 2007). For example, Getzmann (2007) used classical apparent motion to investigate how brief beeps alter visual apparent motion. He found that, similar to the temporal ventriloquist effect, beeps presented before the first and after the second visual flash, as well as simultaneously presented sounds, reduced the motion impression, whereas sounds intervening between the two visual flashes facilitated apparent motion relative to the baseline (visual flashes without sounds). The common explanation of multisensory temporal integration parallels the account of multisensory spatial integration (i.e., the traditional modality precision hypothesis; Welch & Warren, 1980): the auditory modality has a higher temporal resolution than the other modalities and, as a result, auditory information dominates the final temporal percept.

Note that temporal coincidence can be influenced by physical temporal discrepancies (e.g., sound and light travel at different speeds) and by differential neural processing times among the sensory modalities. Scientists are well aware of such neural latency differences. For example, auditory stimuli are often perceived faster than visual stimuli (Levitin, MacLean, Mathews, & Chu, 2000; Stone et al., 2001), whereas for touch the site of stimulation has to be taken into account, because the travel time to the brain is longer from the toes than from the forehead (Vroomen & Keetels, 2010). Although the latencies differ across modalities, the perceptual system still promotes a temporally coherent and unified perception to a certain degree. It is thus essential to investigate the compensation mechanisms of the perceptual system. The tolerance with which the perceptual system compensates for different latencies has been referred to as the temporal window of multisensory integration. To date, it has been shown that this integration window depends on many factors, such as modality, training, and attentional bias. For example, Fujisaki et al. showed that training and adaptation can alter the crossmodal simultaneity window (Fujisaki, Shimojo, Kashino, & Nishida, 2004), and Spence and colleagues demonstrated that attention can also shift the integration window (Spence, Nicholls, & Driver, 2001). However, most of the aforementioned studies examined crossmodal integration between the auditory and visual modalities.
Research on touch-related multisensory temporal integration is relatively scarce, and the temporal integration mechanisms involving touch still need further investigation.
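As a toy illustration of the windowed-integration idea just described, the sketch below treats two signals as one event whenever their perceived asynchrony, after modality-specific latencies and timing noise, falls within a fixed integration window. The latency, noise, and window values are assumptions chosen for illustration only, not empirical estimates.

```python
import random

random.seed(1)

# Assumed neural latencies (ms) and window half-width (ms); illustrative only.
LATENCY = {"visual": 90.0, "auditory": 50.0}
WINDOW = 100.0

def perceived_asynchrony(onset_v, onset_a, jitter_sd=20.0):
    """Audiovisual asynchrony after latencies plus Gaussian timing noise."""
    t_v = onset_v + LATENCY["visual"] + random.gauss(0.0, jitter_sd)
    t_a = onset_a + LATENCY["auditory"] + random.gauss(0.0, jitter_sd)
    return t_v - t_a

def integrated(onset_v, onset_a):
    """One multisensory event if the asynchrony falls inside the window."""
    return abs(perceived_asynchrony(onset_v, onset_a)) < WINDOW

# Physically simultaneous onsets are usually, but not always, unified,
# because the visual latency already consumes part of the window.
hits = sum(integrated(0.0, 0.0) for _ in range(1000))
print(f"integrated on {hits / 10:.0f}% of trials")
```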

Besides spatial proximity and temporal coincidence, other Gestalt grouping principles, such as common fate or common feature, may also lead to a coherent percept. More recently, it has been shown that perceptual grouping in general can be a potent influence on multisensory perception (Spence, Sanabria, & Soto-Faraco, 2007). For example, unimodal auditory grouping and segregation (i.e., pop-out pips) can enhance the discrimination of concurrent visual events (Van der Burg, Olivers, Bronkhorst, & Theeuwes, 2008; Vroomen & de Gelder, 2000) or temporal order judgments (Keetels et al., 2007). In a study of audiovisual interaction in visual apparent motion, Bruns and Getzmann revealed that either a continuous sound filling the gap between two flashes or a short sound intervening between the two flashes promotes crossmodal grouping of movement, which enhances perceived visual motion continuity (Bruns & Getzmann, 2008). However, little is known about how unimodal and crossmodal grouping interact and modulate multisensory temporal integration.

1.1.3 Multisensory time perception

The perception of time, in particular of time across multiple senses, is not straightforward, since there is no dedicated sensory organ for time. Yet the traditional centralized, amodal internal-clock model has dominated the field of time perception over the last 30 years (Bueti, 2011). It consists of a pacemaker emitting pulses at a certain rate and a switch that can open and close to permit an accumulator to collect the emitted pulses. More recently, evidence has accumulated that challenges this one-centralized-clock, amodal model. For instance, the amodal account cannot explain differential, modality-specific pacemaker rates (Droit-Volet, Meck, & Penney, 2007; Penney, Gibson, & Meck, 2000; Wearden, Edwards, Fakhri, & Percival, 1998). Neurophysiological evidence, on the other hand, suggests that separate brain regions are devoted to visual and auditory duration processing (Bueti, Bahrami, & Walsh, 2008; Ghose & Maunsell, 2002). The amodal model also has difficulty explaining, for example, why temporal discrimination is better for audition than for vision (Grondin, 1993), and why an auditory duration is judged to be longer than a physically identical visual duration (Wearden et al., 1998). This recent evidence suggests that time perception is distributed across brain areas and sensory modalities (Bueti, 2011; Ivry & Richardson, 2002; Matell & Meck, 2004).

Because time processing is distributed across modalities, studies of crossmodal time judgments have revealed rather complex and inconclusive results. For instance, it has been shown that the duration of auditory events is lengthened or shortened by the presence of conflicting looming or receding visual information, while the perceived duration of visual events is unaffected by auditory looming or receding stimuli (van Wassenhove, Buonomano, Shimojo, & Shams, 2008). Other studies, using static stimuli or implicit measures, have reported the opposite result, namely that perceived visual duration is affected by a concurrent auditory duration (e.g., Y.-C. Chen & Yeh, 2009).

Unlike spatial perception, time perception can be distorted dramatically by emotional states. For instance, when involved in an accident, such as a car crash, people often report that they felt the world slow down. Research suggests that high-arousal stimuli, such as threatening pictures, are often perceived as longer in duration than neutral stimuli (Droit-Volet et al., 2007).

This emotion-induced lengthening effect has been confirmed in the visual (Angrilli, Cherubini, Pavese, & Manfredini, 1997; Droit-Volet, Brunot, & Niedenthal, 2004) and auditory (Noulhiane, Mella, Samson, Ragot, & Pouthas, 2007) modalities. Although there is now ample evidence of how emotion distorts duration perception, most studies have focused only on unisensory modulation. Given that time perception is a distributed process, there is still only scant understanding of how emotion induced in one sensory modality influences time perception in another modality.

1.1.4 Sensorimotor recalibration and delay perception

Time perception can be influenced by action, too (Bueti & Walsh, 2010; Cunningham, Billock, & Tsou, 2001; Stetson, Cui, Montague, & Eagleman, 2006). Stetson et al. (2006) demonstrated that, following brief exposure to delayed visual feedback of a voluntary action, the onset of the action-feedback signal is perceived as earlier than the action itself once the delay is removed. This effect has been attributed to a dynamic shift of the feedback event toward the onset of the action, which serves to maintain an appropriate perception of causality. Other related studies have confirmed that a delayed sensory effect is perceived as having appeared slightly earlier in time if it follows a voluntary action, a phenomenon referred to as intentional binding. Intentional binding also attracts a voluntary action toward its sensory effect, so that the action is perceived as having occurred slightly later in time, and the feedback delay is perceived as shorter than the actual delay (Engbert, Wohlschläger, & Haggard, 2008; Engbert, Wohlschläger, Thomas, & Haggard, 2007; Haggard, Clark, & Kalogeras, 2002). The shortening effect has been attributed to a transient slowdown of an internal clock after a voluntary action, as a result of which fewer ticks are accumulated (Wearden, 2008). This shortening effect might be reinforced by everyday experience, which leads us to assume sensorimotor synchrony between the start of a motor action and its sensory consequence (Heron, Hanson, & Whitaker, 2009). However, whether sensorimotor temporal recalibration is due to timing changes in the motor system or in the perceptual system is still controversial: some researchers have suggested that it is induced mainly by a temporal shift in the motor system (Sugano, Keetels, & Vroomen, 2010), whereas others have attributed it to purely perceptual learning (Kennedy, Buehner, & Rushton, 2009).

1.1.5 Multisensory enhancement and perceptual learning in visual search

Temporally coinciding multisensory events, such as synchronous audiovisual signals, are easily picked out by our brain among the other objects and events in the environment. For example, a car collision with a loud bang easily draws our attention to the accident spot. Such enhancement may come about as a result of redundant target coding and multisensory perceptual saliency. Multisensory enhancement and facilitation have been shown in various search paradigms in which a visual target was accompanied by a sound signal (Bolia, D'Angelo, & Richard, 1999; Doyle & Snowden, 1998; Van der Burg, Cass, Olivers, Theeuwes, & Alais, 2010).

For example, Doyle and Snowden (1998) found that a simultaneous, spatially congruent sound facilitated covert orienting to non-salient visual targets in a conjunction search paradigm. Interestingly, multisensory enhancement of visual perception and search performance has been found not only with spatially informative but also with merely temporally informative auditory (Van der Burg et al., 2010; 2008; Vroomen & de Gelder, 2000) or tactile signals (Van der Burg, Olivers, Bronkhorst, & Theeuwes, 2009). For instance, Vroomen and de Gelder (2000) investigated crossmodal influences from the auditory onto the visual modality at an early level of perceptual processing. In their study, a visual target was embedded in a rapidly changing sequence of visual distractors. They found that a high tone embedded in a sequence of low tones improved the detection of a synchronously presented visual target, while this enhancement was reduced or abolished when the high tone was presented asynchronously to the visual target or became part of a melody. Using a dynamic visual search paradigm, Van der Burg et al. demonstrated that irrelevant beeps could guide visual attention toward the location of a synchronized visual target which, if presented without such synchronous beeps, was extremely hard to find (Van der Burg et al., 2008). With the aid of the synchronous beeps, search performance improved substantially (in fact, on the order of seconds); Van der Burg et al. referred to this facilitation as the pip-and-pop effect. However, when the synchronized tones were not transient but smooth (e.g., enveloped by a sine wave), the pip-and-pop effect vanished, suggesting that the transient character of the auditory signals is important (Van der Burg et al., 2010). To date, the mechanisms underlying the link between multisensory enhancement and search performance are still not well understood, and they deserve further investigation.

1.1.6 Multimodal feedback delay in human-machine interaction (HMI)

Multisensory time processing has critical implications for human-machine interaction, particularly for multimodal virtual reality systems. Such systems have been adopted in a variety of applications, including remote virtual conferencing, telesurgery, and teleoperation in space and under water. In a typical multimodal telepresence system, multimodal information flows bilaterally between the local and remote sites: users not only receive information from the remote site but also send multimodal commands (e.g., audiovisual streams as well as haptic actions). However, owing to the communication distance, data encoding, and control scheme, communication delays between the local and remote sites are inevitable. These delays can vary from dozens of milliseconds to seconds; for instance, the feedback latency of an intercontinental teleoperation via the Internet is on average 300 ms, while the latency can be up to 5-10 seconds for teleoperation tasks in space. In addition, the delays may differ among modalities. Thus, remote multimodal synchronous events, such as a visual-haptic collision, may turn into local asynchronous incidents, and a normally immediate action-effect turns into an action-delayed-effect as well.

The effects of time delay on simple task performance have been investigated in several studies.

For example, examining the effect of visual-feedback delay on users' task completion time, MacKenzie and Ware found that performance was affected by delays exceeding 75 ms, with completion time thereafter increasing linearly with time delay (> 75 ms) and task difficulty (MacKenzie & Ware, 1993). Similar effects have been confirmed for various modalities, such as delayed visual feedback (Kim, Zimmerman, Wade, & Weiss, 2005), haptic feedback (Ferrell, 1966), and visual-haptic feedback (Jay, Glencross, & Hubbold, 2007). While many studies of time delays have examined issues related to task performance, there are relatively few studies on delay perception per se in multimodal virtual reality systems. Arguably, knowing the human capability of perceiving delays is useful for providing system designers with guidelines for the development of multimodal communication protocols, as well as for human-centered evaluations of existing applications with respect to system fidelity and user experience.

1.2 Cumulative research work

As alluded to above, there are several open key issues in multimodal temporal processing. During my habilitation period, I have focused on the following four research topics:

1. Multisensory temporal integration and motion perception: Using various apparent motion paradigms, the studies in this line of research extended previous work on multisensory integration at a point in time to the integration of multisensory intervals (durations), and revealed that quantitative models, such as MLE, can predict multisensory interval estimation very well. In addition, the influence of crossmodal grouping principles on multisensory integration was extensively investigated.

2. Multisensory time perception: In this line, various studies were conducted on multisensory duration perception, particularly on sensorimotor duration perception and on crossmodal emotional modulation of time perception.

3. Multisensory enhancement, context learning and search performance: In the third line of research, the studies focused on how audiovisual synchronous events and contextual cueing boost visual search performance. Eye tracking was used to reveal how synchronous audiovisual events influence oculomotor behavior. In addition, context learning in general was examined.

4. Multimodal feedback delay and user experience: Feedback delay is ubiquitous in applied multimodal systems involving large data transmissions, such as telepresence. The influence of delay on multisensory perception and user experience is the main focus of this last research agenda. Various studies were conducted to identify the impact of delays in visual-haptic environments on the perception of multisensory simultaneity and on users' operation performance, and, based on these fundamental findings, performance optimization methods were proposed.

1.2.1 Multisensory temporal integration and motion perception

Most studies on multisensory temporal integration follow the traditional approach of multisensory spatial integration (such as the spatial ventriloquist effect), focusing on crossmodal temporal capture at a point in time (e.g., the temporal ventriloquist effect). The common finding is that the onset of a visual event is perceived to be aligned with the onset of an auditory event that appears temporally near the visual event (Burr, Banks, & Morrone, 2009; Freeman & Driver, 2008; Getzmann, 2007; Morein-Zamir et al., 2003; Scheier et al., 1999). However, the temporal ventriloquist effect manifests only with paired audiovisual stimuli: several studies have shown that a single sound leaves visual temporal-order judgments (TOJs) uninfluenced (Morein-Zamir et al., 2003; Scheier et al., 1999). This has been taken to suggest that two sounds are required for the audiovisual stimuli to be perceived as unitary events. Arguably, however, two beeps clearly define an auditory interval, which, in contrast to a point in time, is another feature of time perception. Moreover, paired stimuli can easily form a perceptual group, which may further influence multisensory temporal integration.

To investigate the influence of a sound interval on audiovisual temporal integration, we¹ adopted a Ternus apparent motion paradigm (Shi, Chen, & Müller, 2010). The typical Ternus apparent motion is produced by presenting two sequential visual frames; each frame consists of two horizontal dots, and the two frames, when overlaid, share one common dot at the center. Observers typically report two distinct percepts depending on the inter-stimulus onset interval (ISOI): element motion and group motion. Short ISOIs usually give rise to the percept of element motion, that is, the outer dots are perceived as moving while the center dot appears to remain static or flashing. By contrast, long ISOIs give rise to the percept of group motion: the two dots are perceived to move together as a group (see Figure 1.2). The transition threshold between element motion and group motion, measured as the chance level of a two-alternative forced choice (2AFC), is relatively stable when the spatial configuration is fixed.

Figure 1.2: Schematic representation of the Ternus apparent motion. (a) Element motion percept. (b) Group motion percept.

¹ In most of these studies I collaborated with my colleagues and doctoral students; I therefore prefer the word "we" to "I" in this report. At other times, "we" and "you" are used to refer to a generic third person; the intended sense should be clear from the context.
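The transition threshold can be estimated by fitting a psychometric function to the proportion of group-motion reports across ISOIs. Below is a minimal sketch using a logistic function and scipy; the data points are made up for illustration and are not those of Shi et al. (2010).

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(isoi, pse, slope):
    """Proportion of 'group motion' responses as a function of ISOI (ms)."""
    return 1.0 / (1.0 + np.exp(-(isoi - pse) / slope))

# Hypothetical 2AFC data: tested ISOIs (ms) and observed response proportions.
isoi = np.array([50.0, 80.0, 110.0, 140.0, 170.0, 200.0, 230.0])
p_group = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.92, 0.97])

(pse, slope), _ = curve_fit(logistic, isoi, p_group, p0=[140.0, 20.0])
print(f"transition threshold (PSE) = {pse:.0f} ms")

# A temporal ventriloquist effect (TVE) is then simply the threshold shift
# between a sound condition and the synchronous-sound baseline:
#   tve = pse_condition - pse_baseline
```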

Using the Ternus apparent motion, we could implicitly measure audiovisual duration integration by observing shifts of the transition threshold. In this study (Shi et al., 2010), we systematically investigated the influence of paired beeps and of a single beep in three different audiovisual temporal configurations. In the paired-beeps conditions, the auditory gap intervals were clearly defined. Similarly to previous temporal ventriloquist studies (Morein-Zamir et al., 2003; Scheier et al., 1999), we found audiovisual interval-capture effects: when the first sound preceded the first visual frame and the second sound trailed the second visual frame by 30 ms, more group-motion responses were observed than in the baseline condition, in which the two sounds were presented synchronously with the visual frames. The opposite effect was found when the two sounds were presented in between the two visual frames (see Figure 1.3). However, these audiovisual capture effects almost vanished when one beep was removed (either the first or the second; see Figure 1.4), which strongly suggests that the auditory interval is a critical factor in audiovisual temporal integration. Further experiments quantified this audiovisual interval integration using direct audiovisual interval comparisons. Auditory intervals were typically perceived as longer than visual intervals of the same physical length, and the perceived audiovisual interval was predicted well by the MLE model (Equation 1.1), indicating that auditory and visual intervals are integrated in a way that is optimal in terms of variability.

Figure 1.3: Psychometric curves fitted for the paired-beeps conditions. The solid curve and circles represent the baseline synchronous-sounds condition, the dashed curve and crosses the outer-sounds condition, and the dash-dotted curve and pluses the inner-sounds condition.

Figure 1.4: Psychometric curves fitted for the single-beep conditions. The solid curve and circles represent the baseline synchronous-sound condition, the dashed curve and crosses the preceding-sound condition (audiovisual interval 30 ms), and the dash-dotted curve and pluses the trailing-sound condition (audiovisual interval -30 ms). The magnitude of the temporal ventriloquist effects (TVEs), calculated against the baseline, is shown in a subplot for the preceding-sound (30 ms) and trailing-sound (-30 ms) conditions.

In another study (Chen, Shi, & Müller, 2010), we examined how perceptual grouping in general influences crossmodal temporal processing, using the same Ternus apparent motion paradigm. Instead of the audiovisual modalities, we used visual and tactile Ternus apparent motion, given that we intended to examine bidirectional interactions and that Ternus apparent motion can only be constructed in the visual or the tactile modality. The tactile Ternus apparent motion was created by three solenoids, which tapped three fingers to produce indentation taps; the visual apparent motion was produced by three LEDs placed near the three solenoids. We introduced intra- and crossmodal temporal grouping of the middle element (either tactile or visual) by presenting the middle element twice prior to the Ternus display. We found that intramodal grouping of the middle element with rhythmic or short precue intervals biased the Ternus apparent motion toward element motion, whereas crossmodal grouping had no effect on the Ternus apparent motion with the same temporal settings.

This indicates that intramodal temporal grouping promotes the saliency of the middle element, which leads to more element-motion responses; the effect, however, was too weak to manifest in the crossmodal temporal grouping conditions.

Along this line of research, we further investigated the influence of crossmodal timing and event structure on intra- and crossmodal perceptual grouping (Chen, Shi, & Müller, 2011). In this study we used bi-stable two-tap tactile apparent motion streams. Since the two tactile taps were presented repeatedly with the same inter-stimulus interval, the leftward and rightward motion percepts were bi-stable, that is, two mutually exclusive perceptual states switched equally often and unpredictably. During the 90-second tactile motion stream, mono beeps were added and paired with the tactile taps at various temporal asynchronies. When each tactile tap was paired with one beep, we found a typical temporal ventriloquist effect, as in our earlier study (Shi et al., 2010): the auditory intervals captured the paired tactile intervals. As a result, two taps with a perceived short audiotactile interval were grouped together, forming a dominant tactile motion percept. However, when only half of the taps (e.g., the odd-numbered taps) were paired with beeps, the modulation by audiotactile temporal asynchronies was diminished. Instead of a temporal capture effect, a dominant motion percept from the audiotactile side to the tactile-only side was observed, independently of the crossmodal asynchrony variation. This was mainly due to a strong attentional bias toward the side of the crossmodal grouping, giving rise to apparent tactile motion from the side of the audiotactile grouping to the other side.

Taking these studies together, we have a clear view of how crossmodal intervals and perceptual grouping influence multisensory temporal integration.

The temporal ventriloquist effect has been demonstrated repeatedly for fully paired crossmodal stimuli, and convergent evidence suggests that crossmodal interval/duration integration is one important factor underlying it. When the crossmodal stimuli are unequally paired, on the other hand, perceptual grouping (either intra- or crossmodal) may be processed first, leading to dynamic attention shifts that bias the motion percept.

1.2.2 Multisensory and sensorimotor time perception

Although distributed models of time perception have gradually been accepted in multisensory time research (Bueti, 2011; Buhusi & Meck, 2005), it is still controversial how the distributed (or modality-specific) timing signals are integrated. Distributed timing processes may cause differences between action timing and perceptual timing, which have been addressed only sparsely in the literature. For example, Walker and Scott found that motor reproduction relying only on kinesthetic information (i.e., action timing) overestimated an auditory standard duration by about 12 percent (Walker & Scott, 1981). In a recent study (Bueti & Walsh, 2010), an action task, in which participants reproduced an auditory or visual duration by pressing a button, was compared to a perceptual task, in which participants stopped the comparison signal when its perceived duration matched the standard duration. Action timing was strongly overestimated for short durations and underestimated for long durations. Other studies have demonstrated that a second presented immediately after a saccade or an arm movement is often perceived as longer than subsequent seconds (but see Binda, Cicchini, Burr, & Morrone, 2009; Park, Schlag-Rey, & Schlag, 2003; Yarrow, Haggard, Heal, Brown, & Rothwell, 2001).

Given that action time and perceived time are far from veridical, and that time estimates can easily be biased by various factors, our brain faces the challenge of integrating various sources of temporal information to enable accurate timing of a multisensory or sensorimotor event. In a recent study (Shi, Ganzenmüller, & Müller, 2013), we investigated this issue using three different duration estimation tasks: auditory duration comparison, motor reproduction, and auditory reproduction. The auditory duration comparison and motor reproduction tasks were designed to measure perceptual and action timing, respectively, whereas the auditory reproduction task was a bimodal (i.e., perceptual and motor) task, designed to reveal how perceptual and action durations are integrated. We measured estimation variability in all three tasks. In the spatial domain, reliability-based optimal integration models, such as MLE (Equation 1.1), have successfully predicted the effects of multimodal integration in various cases, such as visual-haptic size estimation and audiovisual localization (for a review, see Ernst & Di Luca, 2011). In one of our previous studies using an implicit measure (Shi et al., 2010), we also found that the MLE model predicts audiovisual duration integration well. We therefore tested the reliability-based integration model for sensorimotor temporal integration (Shi et al., 2013), in particular for auditory reproduction.
In contrast to the previous approach of implicitly assuming unbiased estimates,² we explicitly introduced biases into the quantitative model.

Suppose there is a standard auditory duration D_s. An auditory estimate D̂_a, derived from a duration comparison task, may contain a bias ε_a. A pure motor reproduction, on the other hand, may lead to a different estimate D̂_r, containing a different bias ε_r. That is,

E(D̂_a) = D_s + E(ε_a),   (1.2)
E(D̂_r) = D_s + E(ε_r),   (1.3)

where E(·) is the expectation function. In auditory reproduction, both the perceptual auditory comparison and the motor reproduction are present. Assuming that the perceptual and motor estimates are independent of each other, the maximum likelihood prediction for the auditory reproduction is

E(D̂_ar) = D_s + w_a E(ε_a) + w_r E(ε_r),   (1.4)

where w_a and w_r are the weights of the perceptual and motor estimates. According to MLE, the optimal weights are inversely proportional to the corresponding variances:

w_a = (1/σ_a²) / (1/σ_a² + 1/σ_r²),   (1.5)
w_r = 1 − w_a.   (1.6)

If the optimal weighting rule is followed, the variance of the auditory reproduction, σ_ar², should also be lower than the variances of the pure perceptual and motor estimates, σ_a² and σ_r².

² For Bayesian integration models, disregarding biases allows one to focus on minimizing variance as an optimality criterion. In some studies (e.g., Burr et al., 2009), biases are assumed to be constant across all conditions.
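For concreteness, here is a minimal numerical sketch of Equations 1.2-1.6, assuming independent Gaussian perceptual and motor estimates. The parameter values are invented for illustration and are not the empirical values reported below.

```python
def predict_reproduction(d_s, bias_a, sigma_a, bias_r, sigma_r):
    """MLE prediction for auditory reproduction from biased perceptual
    (comparison) and motor (reproduction) estimates, Eqs. 1.4-1.6."""
    w_a = (1.0 / sigma_a**2) / (1.0 / sigma_a**2 + 1.0 / sigma_r**2)
    w_r = 1.0 - w_a
    mean_ar = d_s + w_a * bias_a + w_r * bias_r           # Eq. 1.4
    var_ar = 1.0 / (1.0 / sigma_a**2 + 1.0 / sigma_r**2)  # optimal variance
    return mean_ar, var_ar

# A 1000-ms standard, a small perceptual bias, and a large motor
# over-reproduction (all values made up for illustration).
mean_ar, var_ar = predict_reproduction(d_s=1000.0, bias_a=20.0, sigma_a=80.0,
                                       bias_r=400.0, sigma_r=150.0)
print(f"predicted reproduction = {mean_ar:.0f} ms, SD = {var_ar**0.5:.0f} ms")
```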

Using one-second auditory intervals as the standard stimulus in the three duration tasks, we confirmed the previous finding of overestimation in motor reproduction (Walker & Scott, 1981). In our case, motor reproduction produced about 40% overestimation, whereas the auditory comparison task yielded relatively precise estimates (Figure 1.5).

Figure 1.5: Mean estimation biases and standard deviations (SDs), with ±1 standard error bars, for 1-second duration estimation in the three different tasks.

We further compared the reliability-based MLE predictions with the observed behavioral results and found the agreement to be relatively high for both the observed auditory reproduction (r = 0.62) and its variability (r = 0.68). Similar conclusions were confirmed by a subsequent experiment with varied standard durations and varied signal-to-noise ratios (SNRs) in the comparison/reproduction tones (Figure 1.6; r = 0.81). The MLE prediction of sensorimotor duration reproduction proved far better than either a motor or a perceptual dominance model. Turning to the variability of the bimodal condition, however, the MLE model turned out to be suboptimal, that is, it did not show the theoretical improvement. Interestingly, though, this result confirmed our previous findings (Shi et al., 2010) and other recent studies (Burr et al., 2009; Hartcher-O'Brien & Alais, 2011; Tomassini, Gori, Burr, Sandini, & Morrone, 2011). That is, the variability in crossmodal temporal integration is often found to be suboptimal. The reason for this suboptimal integration is not clear at present. It has been suggested that the assumption of Gaussian noise might not be appropriate for timing tasks (Burr et al., 2009). Alternatively, additional decision noise may be introduced in the bimodal (or sensorimotor) task, owing to the multiple sources of information and the increased task difficulty. It is also possible that time estimates from different sensory (and motor) modalities are not independently distributed but partially dependent, as hinted at by the literature on the amodal internal clock model. When sensory estimates are correlated, it has been shown that the true optimal weights and reliability can deviate dramatically from those of independent optimal integration (Oruç, Maloney, & Landy, 2003).

Figure 1.6: A. Mean estimation biases (with ±1 standard error bars) as a function of standard duration and SNR. H and L denote high and low SNRs; 800 and 1200 denote the short and long standards in ms. B. Observed reproductions plotted against predicted reproductions. The solid red line is a linear regression of the data (y = x); the dot-dashed line indicates ideal optimal cue integration based on MLE. The green and blue crosses represent data from the high and low SNR conditions, respectively.
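To illustrate this last point, the sketch below computes the minimum-variance weight for two correlated estimates; setting rho = 0 recovers the independent MLE weight of Equation 1.5. This is a generic textbook result, used here only to illustrate the argument of Oruç et al. (2003), not code from that study.

```python
def optimal_weight_correlated(sigma_a, sigma_r, rho):
    """Minimum-variance weight for the perceptual estimate when the two
    estimates share correlated noise; rho = 0 gives the independent case."""
    cov = rho * sigma_a * sigma_r
    return (sigma_r**2 - cov) / (sigma_a**2 + sigma_r**2 - 2.0 * cov)

# With strong correlation the optimal weight can deviate dramatically from
# the independent case, and can even exceed 1.
for rho in (0.0, 0.3, 0.6):
    w_a = optimal_weight_correlated(sigma_a=80.0, sigma_r=150.0, rho=rho)
    print(f"rho = {rho:.1f}: w_a = {w_a:.2f}")
```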

In addition to the feedback information itself, the delay of the feedback can influence duration reproduction, too. In another recent study (Ganzenmüller, Shi, & Müller, 2012), we investigated this issue by injecting an onset or offset delay into the sensory feedback signal of a duration reproduction task. We found that the reproduced duration was lengthened, and that this lengthening effect appeared immediately, on the first trial with the onset delay. In contrast, a shortening effect was found with an offset delay of the feedback signal, though this effect was weaker and manifested only partially in auditory reproduction, not in visual reproduction; the offset of the reproduction relied largely on the action-stop signal. These findings suggest that reproduction with feedback integrates both perceptual and action timing, but relies differentially on the onset of the feedback signal and on the motor-stop signal. Such differential binding may well relate to the memory-mixing model (Gu & Meck, 2011): owing to the limited capacity of working memory and the cause-effect relationship, the motor timing and the timing of the caused feedback may share the same representation, which pulls the two onsets closer together. In this study (Ganzenmüller et al., 2012), we further confirmed the strong overestimation in auditory reproduction found in other studies (Bueti & Walsh, 2010; Shi et al., 2013; Walker & Scott, 1981).

Overestimation of duration can also be induced by emotional states. For example, threatening pictures (Droit-Volet et al., 2004) or angry faces (Droit-Volet et al., 2007) are often judged as lasting longer than neutral stimuli. However, most evidence for emotional distortion of time perception has been obtained with unisensory modulation only. Given that time processing is distributed (for a review, see Bueti, 2011), there is no guarantee that emotional stimuli presented in one modality influence the perceived time of stimuli in another modality. On the other hand, emotional states may increase the general arousal level and/or bias crossmodal linkages and perception-action associations, which may in turn influence duration judgments in other modalities. Recently we investigated this issue using a visual-tactile approach (Shi, Jia, & Müller, 2012), comparing the modulation induced by three types of emotional pictures (threat, disgust, and neutral) on subsequent judgments of vibrotactile duration. The results revealed that the processing of threatening pictures lengthened subsequent judgments of tactile duration relative to the neutral baseline. However, there was no evidence of a lengthening effect for disgust pictures, which clearly rules out general arousal as the determining factor. We further examined how visual threat influences tactile time processing. If the threat merely sped up the pacemaker of the tactile clock, one should observe a slope effect across short- and long-range intervals (Wearden, 2008), that is, a larger difference between the threat and neutral conditions for long intervals than for short ones. However, this was not the case. Further experiments suggested that emotional activation is followed by emotional regulation: when participants were exposed to threatening pictures, attentional resources were first rapidly directed to the defensive system, including the somatosensory system, in preparation for a reaction, and as a result tactile time processing was dilated. While the same would initially apply to the long-interval condition, participants eventually realized that the tactile stimulus was not a threatening event.
Accordingly, attentional resources would increasingly be redirected to processes of emotional regulation, and as a consequence the lengthening effect disappeared.

A high-arousal emotional state not only dilates perceived duration but also prioritizes crossmodal temporal processing, as shown in one of our new studies (Jia, Shi, Zang, & Müller, 2013).

In that study, participants made temporal order judgments (TOJs) on a pair of audiotactile stimuli while gazing at a concurrently presented emotional picture. When the audiotactile stimuli were presented separately on the left and right sides, a significant temporal bias toward the tactile modality was found when the picture had a negative meaning (e.g., threat). This finding confirmed our previous conclusion (Shi et al., 2012) that the visual-tactile linkage in emotional associations is more likely to direct attention toward the tactile than toward the auditory modality. Interestingly, when the audiotactile stimuli originated from the same location, there was no such emotional modulation of modality-oriented attention. This suggests that the unity assumption in crossmodal integration (Welch & Warren, 1980), namely that multisensory stimuli coming from the same origin are more likely to be integrated into a single multisensory object than treated as two distal signals, can counteract the otherwise ensuing modality-oriented attentional bias.

1.2.3 Multisensory enhancement, context learning and search performance

It is known that the detection of a spatiotemporally coinciding multisensory signal is faster than the detection of each of the corresponding signals presented separately. Recent studies by Van der Burg and colleagues revealed an interesting phenomenon, the pip-and-pop effect, showing that spatially uninformative but temporally informative beeps can facilitate search performance (Van der Burg et al., 2010; 2008). In their paradigm, participants had to search for a horizontal or vertical bar among oblique distractors. Both the target and the distractors were either green or red and changed their color randomly, making the search task extremely difficult (see Figure 1.7). When the color changes of the target were accompanied by synchronous beeps, however, search performance was boosted on the order of seconds. Van der Burg and colleagues argued that this enhanced performance was due to bottom-up audiovisual integration and saliency boosting. In contrast, other literature (Colonius & Arndt, 2001; Doyle & Snowden, 1998) showed that performance enhancement by audiovisual integration is typically around 100 ms, far less than the reported pip-and-pop effect.

Figure 1.7: An example search display from the pip-and-pop paradigm. Displays contained multiple bars of different orientations, and observers had to detect the target's orientation (or the target's presence, in one of our experiments). The colors of the display items alternated repeatedly, at random time intervals, and the onsets of the color changes were accompanied by mono beeps.

To further examine the effects of spatially uninformative sound on visual search and the underlying mechanisms, we recently adopted the pip-and-pop paradigm (Van der Burg et al., 2008) and measured eye movements (Zou, Müller, & Shi, 2012). In addition to the auditory synchronous cues, we introduced an informative spatial (central arrow) cue as top-down attentional guidance, as well as a target-absent condition in a separate experiment. If the pip-and-pop effect were purely bottom-up crossmodal enhancement, we should observe no interaction with the top-down precue manipulation, and no facilitation in the target-absent condition, given that no crossmodal integration would occur there. Our study replicated the pip-and-pop effect. More interestingly, the effect was not purely bottom-up, as we found an interaction between the top-down precue and sound presence (Figure 1.8, left). In addition, detection was also facilitated by the presence of the beeps when the target was absent (Figure 1.8, right). These behavioral results indicate that participants must have adopted some top-down strategies. The eye movement data further showed that the mean fixation duration was longer in the sound-present than in the sound-absent condition (see Figure 1.9).

In particular, the fixation duration was extended when a beep occurred during the fixation, and the amplitude of the immediately following saccade was increased. The eye movement patterns revealed that participants tended to fixate longer when the additional sounds were presented, permitting temporally and spatially expanded information sampling; this improved the registration of singleton color changes and thus guided the next saccade more precisely and efficiently to the target. The study demonstrated that temporally coincident audiovisual events not only produce perceptual enhancement but also influence oculomotor behavior and boost search performance.

Figure 1.8: Left: Mean reaction time (±SE) in seconds as a function of cue validity and sound presence. Right: Mean reaction time (±SE) in seconds as a function of target presence, for the sound-present (stars) and sound-absent (squares) conditions, respectively.

Figure 1.9: (a) Mean fixation duration (±SE) in milliseconds as a function of target presence (present, absent), for the sound-present (stars) and sound-absent (squares) conditions, respectively. (b) Mean fixation duration (±SE) in milliseconds as a function of target presence, separately for fixations on sound-absent trials (squares) and for fixations with (stars) and without (diamonds) a beep on sound-present trials. (c) Mean number of fixations (±SE) as a function of target presence, for the sound-present (stars) and sound-absent (squares) conditions, respectively. (d) Mean saccade amplitude (±SE) in degrees of visual angle as a function of target presence, for the sound-present (stars) and sound-absent (squares) conditions, respectively.

Besides multisensory enhancement, the learning of spatial context can also facilitate search performance. In one of our recent studies (Geyer, Shi, & Müller, 2010), a contextual cueing paradigm with multiconjunction visual search was used. We confirmed a robust contextual cueing effect: target presence was discerned more rapidly when the target was embedded in a predictive compared to a non-predictive configuration. Furthermore, contextual cueing was larger when only the subset of the configuration containing the target, rather than the other subsets, was predictive, and it was also larger when a predictive display was shown repeatedly on two successive trials. These findings reveal the importance of spatial context learning for the guidance of visual search.

In another recent study (Shi, Zang, Jia, Geyer, & Müller, 2013), we applied a similar contextual cueing paradigm to a mobile user interface, examining icon re-configurations during display-mode switches in touch-based mobile devices. In most current devices, icons are shuffled in positional order when the display mode is changed (e.g., from portrait to landscape mode). Such remapping disrupts the spatial relationships among the icons (see Figure 1.10).

Figure 1.10: Mockup displays for a mobile device. When the display mode is changed, the icons are shuffled and the spatial relationships among them are partially destroyed.

We tested several novel display remapping methods: position-order invariant (the traditional icon-shuffle method), global rotation (rotating the whole display), local invariant (preserving local regions), and central invariant (preserving the central maximal square region). We found that with the local-invariant and central-invariant remapping methods, contextual cueing was preserved after the display change, indicating performance benefits for the icon localization task. The global-rotation method is intuitive for users; in the present study, however, which used a desktop monitor to simulate the mobile device, it might have introduced additional mental rotation that was detrimental to search performance. The findings thus provide new guidelines for the design of icon rearrangement in mobile device interfaces.
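As a toy sketch of two of these remapping schemes, assuming the icons live on a simple row-by-column grid (the grid size and coordinate convention are hypothetical, not the study's actual displays):

```python
def position_order_invariant(n_rows, n_cols, positions):
    """Traditional shuffle: keep the linear (reading) order of the icons and
    refill the rotated grid row by row; spatial neighborhoods are broken."""
    order = sorted(positions, key=lambda rc: rc[0] * n_cols + rc[1])
    # Portrait (n_rows x n_cols) becomes landscape (n_cols x n_rows).
    return {old: (i // n_rows, i % n_rows) for i, old in enumerate(order)}

def global_rotation(n_rows, n_cols, positions):
    """Rotate the whole display by 90 degrees: every icon keeps its
    neighbors, so the learned spatial context is preserved."""
    return {(r, c): (c, n_rows - 1 - r) for (r, c) in positions}

# A 4x3 portrait grid of 12 icons.
icons = [(r, c) for r in range(4) for c in range(3)]
print(position_order_invariant(4, 3, icons)[(1, 0)])  # (0, 3): new neighbors
print(global_rotation(4, 3, icons)[(1, 0)])           # (0, 2): context kept
```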

1.2.4 Delays in multimodal feedback and user experience

Delay is ubiquitous in signal transmission and processing. Neural processing, for example, takes time to convey sensory information to the brain: signals from the human retina need about 70 to 100 ms to reach the visual cortex (Schmolesky et al., 1998), and the other modalities have similar neural latencies. Given that such delays are not negligible, one challenge our visual system faces in everyday environments is the veridical spatiotemporal representation of moving objects: a fast-moving object would be seen with a spatial lag if the latency (about 100 ms) were not compensated. A typical visual illusion induced by neural transmission delay is the flash-lag effect, in which a moving object appears to be ahead of a spatially aligned flashed object. The initial hypothesis, proposed by Nijhawan (1994), holds that the position of the moving object is extrapolated forward to compensate for the neural delays in the visual pathway, so that the object's perceived position is closer to its true instantaneous location. Since then, alternative accounts, such as differential latency, attention shifts, and postdiction (Baldo & Klein, 1995; Eagleman, 2000; Whitney & Murakami, 1998), have been proposed to explain the flash-lag effect.

The major difference between the extrapolation account (Nijhawan, 1994) and the others is that the alternative hypotheses simply deny a low-level compensation mechanism, since such low-level extrapolation is hard to observe directly. In a recent study (Shi & Nijhawan, 2012), we directly tested the extrapolation hypothesis with a novel approach, exploiting the nature of two foveal scotomas (i.e., the scotoma for dim light and the scotoma for blue light) to trigger motion extrapolation. In the central fovea there is a rod-free area, about 0.3° in diameter, where low-intensity objects fail to yield a visual percept (Hecht, 2002). If the motion percept faithfully followed the retinotopic map, one should observe a discontinuity in the movement at the boundary of the fovea when a dim object moves across it (see Figure 1.11, left). Forward shifts should be observed, however, if there is motion extrapolation due to a compensation mechanism in the visual pathway, even though there is no physical response in the central fovea (see Figure 1.11, right). Indeed, our behavioral experiments provided solid evidence supporting the original motion extrapolation account (see Figure 1.12).
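As a back-of-the-envelope illustration of the uncompensated-latency problem discussed above (the object speed is an arbitrary example value):

```python
def spatial_lag(speed_deg_per_s, latency_ms):
    """Spatial lag accrued by a moving object if a fixed neural latency
    went uncompensated: lag = speed * latency."""
    return speed_deg_per_s * latency_ms / 1000.0

# With the ~100-ms retina-to-cortex latency cited above, a target moving at
# 10 deg/s would be mislocalized by a full degree without extrapolation.
print(f"lag = {spatial_lag(10.0, 100.0):.1f} deg of visual angle")
```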

Perceived time delays and crossmodal asynchronies also exist for externally synchronous multisensory events. Sound, for example, travels through air much more slowly than light, so we hear thunder several seconds after we see the lightning flash. Even when a light stimulates the retina and a sound pushes the eardrum at the same time, brain activation still occurs earlier for the auditory signal (Fujisaki et al., 2004). To achieve a coherent perception of the external world and precise sensorimotor interaction with the environment, our brain must compensate for these latencies and adjust multisensory temporal perception accordingly. In a number of recent studies (Rank, Shi, Müller, & Hirche, 2010; 2010; Shi, Zou, & Müller, 2010; Shi et al., 2010), we investigated various forms of delayed multimodal feedback in multimodal telepresence systems and gained a better understanding of how multimodal delays affect the perception of simultaneity and the user experience.

Figure 1.11: Left: A dim object moves across the fovea. If there were no extrapolation mechanism in the visual pathway, the motion percept would faithfully follow the retinotopic map. Right: Owing to the extrapolation mechanism in the visual pathway, the moving object is still perceived within the rod-free fovea, and it reappears farther away from the fovea because of the neural transmission delay.

Figure 1.12: Results of Experiment 1 from Shi & Nijhawan (2012). Left: Individual participants' thresholds for the three conditions. The left arrows denote the perceived vanishing positions in the motion-terminated condition; the right arrows denote the perceived initial positions in the motion-initiated condition; the gray bars denote the thresholds (50%) of motion visibility (in cd/m²). Right: Mean forward shifts in the motion-initiated and motion-terminated conditions (±SE, n = 6). The vertical dot-dashed line denotes the mean radius of the relatively insensitive fovea centralis.


More information

Modality Differences in Timing: Testing the Pacemaker Speed Explanation

Modality Differences in Timing: Testing the Pacemaker Speed Explanation Modality Differences in Timing: Testing the Pacemaker Speed Explanation Emily A. Williams (Emily.A.Williams@Manchester.ac.uk) Andrew J. Stewart (Andrew.J.Stewart@Manchester.ac.uk) Luke A. Jones (Luke.Jones@Manchester.ac.uk)

More information

Lihan Chen. Crossmodal Temporal Capture in Visual and Tactile Apparent Motion: Influences of temporal structure and crossmodal grouping

Lihan Chen. Crossmodal Temporal Capture in Visual and Tactile Apparent Motion: Influences of temporal structure and crossmodal grouping Lihan Chen Crossmodal Temporal Capture in Visual and Tactile Apparent Motion: Influences of temporal structure and crossmodal grouping München 2009 Crossmodal Temporal Capture in Visual and Tactile Apparent

More information

New insights in audio-visual synchrony perception

New insights in audio-visual synchrony perception Eindhoven, April 14th, 2011 New insights in audio-visual synchrony perception by Irene Kuling identity number 0667797 in partial fulfilment of the requirements for the degree of Master of Science in Human-Technology

More information

Direction of visual apparent motion driven by perceptual organization of cross-modal signals

Direction of visual apparent motion driven by perceptual organization of cross-modal signals Journal of Vision (2013) 13(1):6, 1 13 http://www.journalofvision.org/content/13/1/6 1 Direction of visual apparent motion driven by perceptual organization of cross-modal signals NTT Communication Science

More information

Coordination in Sensory Integration

Coordination in Sensory Integration 15 Coordination in Sensory Integration Jochen Triesch, Constantin Rothkopf, and Thomas Weisswange Abstract Effective perception requires the integration of many noisy and ambiguous sensory signals across

More information

Audio-Visual Speech Timing Sensitivity Is Enhanced in Cluttered Conditions

Audio-Visual Speech Timing Sensitivity Is Enhanced in Cluttered Conditions Audio-Visual Speech Timing Sensitivity Is Enhanced in Cluttered Conditions Warrick Roseboom 1 *, Shin ya Nishida 2, Waka Fujisaki 3, Derek H. Arnold 1 1 School of Psychology, The University of Queensland,

More information

Validity of Haptic Cues and Its Effect on Priming Visual Spatial Attention

Validity of Haptic Cues and Its Effect on Priming Visual Spatial Attention Validity of Haptic Cues and Its Effect on Priming Visual Spatial Attention J. Jay Young & Hong Z. Tan Haptic Interface Research Laboratory Purdue University 1285 EE Building West Lafayette, IN 47907 {youngj,

More information

A model of parallel time estimation

A model of parallel time estimation A model of parallel time estimation Hedderik van Rijn 1 and Niels Taatgen 1,2 1 Department of Artificial Intelligence, University of Groningen Grote Kruisstraat 2/1, 9712 TS Groningen 2 Department of Psychology,

More information

Counting visual and tactile events: The effect of attention on multisensory integration

Counting visual and tactile events: The effect of attention on multisensory integration Attention, Perception, & Psychophysics 2009, 71 (8), 1854-1861 doi:10.3758/app.71.8.1854 Counting visual and tactile events: The effect of attention on multisensory integration PETER J. WERKHOVEN Utrecht

More information

Tilburg University. Auditory grouping occurs prior to intersensory pairing Keetels, Mirjam; Stekelenburg, Jeroen; Vroomen, Jean

Tilburg University. Auditory grouping occurs prior to intersensory pairing Keetels, Mirjam; Stekelenburg, Jeroen; Vroomen, Jean Tilburg University Auditory grouping occurs prior to intersensory pairing Keetels, Mirjam; Stekelenburg, Jeroen; Vroomen, Jean Published in: Experimental Brain Research Document version: Publisher's PDF,

More information

Bayesian Inference Explains Perception of Unity and Ventriloquism Aftereffect: Identification of Common Sources of Audiovisual Stimuli

Bayesian Inference Explains Perception of Unity and Ventriloquism Aftereffect: Identification of Common Sources of Audiovisual Stimuli LETTER Communicated by Robert A. Jacobs Bayesian Inference Explains Perception of Unity and Ventriloquism Aftereffect: Identification of Common Sources of Audiovisual Stimuli Yoshiyuki Sato yoshi@sat.t.u-tokyo.ac.jp

More information

Theoretical Neuroscience: The Binding Problem Jan Scholz, , University of Osnabrück

Theoretical Neuroscience: The Binding Problem Jan Scholz, , University of Osnabrück The Binding Problem This lecture is based on following articles: Adina L. Roskies: The Binding Problem; Neuron 1999 24: 7 Charles M. Gray: The Temporal Correlation Hypothesis of Visual Feature Integration:

More information

Neuroscience Letters

Neuroscience Letters Neuroscience Letters 450 (2009) 60 64 Contents lists available at ScienceDirect Neuroscience Letters journal homepage: www. elsevier. com/ locate/ neulet Poke and pop: Tactile visual synchrony increases

More information

Tilburg University. The spatial constraint in intersensory pairing Vroomen, Jean; Keetels, Mirjam

Tilburg University. The spatial constraint in intersensory pairing Vroomen, Jean; Keetels, Mirjam Tilburg University The spatial constraint in intersensory pairing Vroomen, Jean; Keetels, Mirjam Published in: Journal of Experimental Psychology. Human Perception and Performance Document version: Publisher's

More information

Sensory Cue Integration

Sensory Cue Integration Sensory Cue Integration Summary by Byoung-Hee Kim Computer Science and Engineering (CSE) http://bi.snu.ac.kr/ Presentation Guideline Quiz on the gist of the chapter (5 min) Presenters: prepare one main

More information

Psychophysical Methods

Psychophysical Methods Psychophysical Methods First Lesson Monica Gori Course Outline What is Psychophysic? History and example experiments Concept of threshold Absolute Threshold + examples Differential Threshold + examples

More information

(Visual) Attention. October 3, PSY Visual Attention 1

(Visual) Attention. October 3, PSY Visual Attention 1 (Visual) Attention Perception and awareness of a visual object seems to involve attending to the object. Do we have to attend to an object to perceive it? Some tasks seem to proceed with little or no attention

More information

Attention Response Functions: Characterizing Brain Areas Using fmri Activation during Parametric Variations of Attentional Load

Attention Response Functions: Characterizing Brain Areas Using fmri Activation during Parametric Variations of Attentional Load Attention Response Functions: Characterizing Brain Areas Using fmri Activation during Parametric Variations of Attentional Load Intro Examine attention response functions Compare an attention-demanding

More information

July 2014-present Postdoctoral Fellowship, in the Department of Experimental Psychology,

July 2014-present Postdoctoral Fellowship, in the Department of Experimental Psychology, Xuelian Zang Date of Birth: April 28th, 1986 Citizenship: Chinese Mobile: +49(0)159 0372 3091 Email: zangxuelian@gmail.com Address: Sennesweg 17, 85540, Haar, Munich, Germany Education July 2014-present

More information

Mutual Influences of Intermodal Visual/Tactile Apparent Motion and Auditory Motion with Uncrossed and Crossed Arms

Mutual Influences of Intermodal Visual/Tactile Apparent Motion and Auditory Motion with Uncrossed and Crossed Arms Multisensory Research 26 (2013) 19 51 brill.com/msr Mutual Influences of Intermodal Visual/Tactile Apparent Motion and Auditory Motion with Uncrossed and Crossed Arms Yushi Jiang 1,2 and Lihan Chen 1,3,

More information

Seeing Sound: Changing Visual Perception Through Cross-Modal Interaction. Tracey D. Berger. New York University. Department of Psychology

Seeing Sound: Changing Visual Perception Through Cross-Modal Interaction. Tracey D. Berger. New York University. Department of Psychology Cross-Modal Effects on Perception 1 Seeing Sound: Changing Visual Perception Through Cross-Modal Interaction. Tracey D. Berger New York University Department of Psychology Faculty Sponsor : Denis Pelli

More information

Introduction to Computational Neuroscience

Introduction to Computational Neuroscience Introduction to Computational Neuroscience Lecture 11: Attention & Decision making Lesson Title 1 Introduction 2 Structure and Function of the NS 3 Windows to the Brain 4 Data analysis 5 Data analysis

More information

Some methodological aspects for measuring asynchrony detection in audio-visual stimuli

Some methodological aspects for measuring asynchrony detection in audio-visual stimuli Some methodological aspects for measuring asynchrony detection in audio-visual stimuli Pacs Reference: 43.66.Mk, 43.66.Lj Van de Par, Steven ; Kohlrausch, Armin,2 ; and Juola, James F. 3 ) Philips Research

More information

Report. Direction of Visual Apparent Motion Driven Solely by Timing of a Static Sound. Elliot Freeman 1,2, * and Jon Driver 2 1

Report. Direction of Visual Apparent Motion Driven Solely by Timing of a Static Sound. Elliot Freeman 1,2, * and Jon Driver 2 1 Current Biology 18, 1262 1266, August 26, 2008 ª2008 Elsevier Ltd All rights reserved DOI 10.1016/j.cub.2008.07.066 Direction of Visual Apparent Motion Driven Solely by Timing of a Static Sound Report

More information

Visual stream segregation has been proposed as a method to measure visual

Visual stream segregation has been proposed as a method to measure visual Dyslexia and the assessment of visual attention. Bernt C Skottun Ullevålsalleen 4C, 0852 Oslo, Norway John R Skoyles Centre for Mathematics and Physics in the Life Sciences and Experimental Biology (CoMPLEX),

More information

Auditory scene analysis in humans: Implications for computational implementations.

Auditory scene analysis in humans: Implications for computational implementations. Auditory scene analysis in humans: Implications for computational implementations. Albert S. Bregman McGill University Introduction. The scene analysis problem. Two dimensions of grouping. Recognition

More information

SDT Clarifications 1. 1Colorado State University 2 University of Toronto 3 EurekaFacts LLC 4 University of California San Diego

SDT Clarifications 1. 1Colorado State University 2 University of Toronto 3 EurekaFacts LLC 4 University of California San Diego SDT Clarifications 1 Further clarifying signal detection theoretic interpretations of the Müller Lyer and sound induced flash illusions Jessica K. Witt 1*, J. Eric T. Taylor 2, Mila Sugovic 3, John T.

More information

Supervised Calibration Relies on the Multisensory Percept

Supervised Calibration Relies on the Multisensory Percept Article Supervised Calibration Relies on the Multisensory Percept Adam Zaidel, 1, * Wei Ji Ma, 1,2 and Dora E. Angelaki 1 1 Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA

More information

Supplemental Information: Task-specific transfer of perceptual learning across sensory modalities

Supplemental Information: Task-specific transfer of perceptual learning across sensory modalities Supplemental Information: Task-specific transfer of perceptual learning across sensory modalities David P. McGovern, Andrew T. Astle, Sarah L. Clavin and Fiona N. Newell Figure S1: Group-averaged learning

More information

Comparing Bayesian models for multisensory cue combination without mandatory integration

Comparing Bayesian models for multisensory cue combination without mandatory integration Comparing Bayesian models for multisensory cue combination without mandatory integration Ulrik R. Beierholm Computation and Neural Systems California Institute of Technology Pasadena, CA 9105 beierh@caltech.edu

More information

A Race Model of Perceptual Forced Choice Reaction Time

A Race Model of Perceptual Forced Choice Reaction Time A Race Model of Perceptual Forced Choice Reaction Time David E. Huber (dhuber@psyc.umd.edu) Department of Psychology, 1147 Biology/Psychology Building College Park, MD 2742 USA Denis Cousineau (Denis.Cousineau@UMontreal.CA)

More information

Lecturer: Rob van der Willigen 11/9/08

Lecturer: Rob van der Willigen 11/9/08 Auditory Perception - Detection versus Discrimination - Localization versus Discrimination - - Electrophysiological Measurements Psychophysical Measurements Three Approaches to Researching Audition physiology

More information

Addressing Conflicts in Sensory Dominance Research

Addressing Conflicts in Sensory Dominance Research Addressing Conflicts in Sensory Dominance Research A THESIS SUBMITTED TO THE GRADUATE DIVISION OF THE UNIVERSTY OF HAWAII AT MANOA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF

More information

Attention enhances feature integration

Attention enhances feature integration Vision Research 43 (2003) 1793 1798 Rapid Communication Attention enhances feature integration www.elsevier.com/locate/visres Liza Paul *, Philippe G. Schyns Department of Psychology, University of Glasgow,

More information

Lecturer: Rob van der Willigen 11/9/08

Lecturer: Rob van der Willigen 11/9/08 Auditory Perception - Detection versus Discrimination - Localization versus Discrimination - Electrophysiological Measurements - Psychophysical Measurements 1 Three Approaches to Researching Audition physiology

More information

Cross-modal facilitation of visual and tactile motion

Cross-modal facilitation of visual and tactile motion Submitted to Nature Neuroscience 8/2/2008 Cross-modal facilitation of visual and tactile motion Monica Gori, Giulio Sandini and David C. Burr 2,3. Istituto Italiano di Tecnologia, via Morego 30, 663 Genoa,

More information

When do auditory/visual differences in duration judgements occur?

When do auditory/visual differences in duration judgements occur? THE QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY 2006, 59 (10), 1709 1724 When do auditory/visual differences in duration judgements occur? J. H. Wearden Keele University, Keele, UK N. P. M. Todd and L.

More information

Visual dominance and attention: The Colavita effect revisited

Visual dominance and attention: The Colavita effect revisited Perception & Psychophysics 2007, 69 (5), 673-686 Visual dominance and attention: The Colavita effect revisited SCOTT SINNETT Universitat de Barcelona, Barcelona, Spain CHARLES SPENCE University of Oxford,

More information

2012 Course: The Statistician Brain: the Bayesian Revolution in Cognitive Sciences

2012 Course: The Statistician Brain: the Bayesian Revolution in Cognitive Sciences 2012 Course: The Statistician Brain: the Bayesian Revolution in Cognitive Sciences Stanislas Dehaene Chair of Experimental Cognitive Psychology Lecture n 5 Bayesian Decision-Making Lecture material translated

More information

Multimodal Driver Displays: Potential and Limitations. Ioannis Politis

Multimodal Driver Displays: Potential and Limitations. Ioannis Politis Multimodal Driver Displays: Potential and Limitations Ioannis Politis About me (http://yannispolitis.info/hci/) Background: B.Sc. Informatics & Telecommunications University of Athens M.Sc. Advanced Information

More information

The Effects of Action on Perception. Andriana Tesoro. California State University, Long Beach

The Effects of Action on Perception. Andriana Tesoro. California State University, Long Beach ACTION ON PERCEPTION 1 The Effects of Action on Perception Andriana Tesoro California State University, Long Beach ACTION ON PERCEPTION 2 The Effects of Action on Perception Perception is a process that

More information

Sensory Adaptation within a Bayesian Framework for Perception

Sensory Adaptation within a Bayesian Framework for Perception presented at: NIPS-05, Vancouver BC Canada, Dec 2005. published in: Advances in Neural Information Processing Systems eds. Y. Weiss, B. Schölkopf, and J. Platt volume 18, pages 1291-1298, May 2006 MIT

More information

The role of modality congruence in the presentation and recognition of taskirrelevant stimuli in dual task paradigms.

The role of modality congruence in the presentation and recognition of taskirrelevant stimuli in dual task paradigms. The role of modality congruence in the presentation and recognition of taskirrelevant stimuli in dual task paradigms. Maegen Walker (maegenw@hawaii.edu) Department of Psychology, University of Hawaii at

More information

Crossmodal influences on visual perception

Crossmodal influences on visual perception JID:PLREV AID:99 /REV [m3sc+; v 1.120; Prn:4/05/2010; 9:02] P.1 (1-16) Physics of Life Reviews ( ) Review Crossmodal influences on visual perception Ladan Shams a,b,c,, Robyn Kim a a Department of Psychology,

More information

Visual Selection and Attention

Visual Selection and Attention Visual Selection and Attention Retrieve Information Select what to observe No time to focus on every object Overt Selections Performed by eye movements Covert Selections Performed by visual attention 2

More information

Perception of Synchrony between the Senses

Perception of Synchrony between the Senses 9 Perception of Synchrony between the Senses Mirjam Keetels and Jean Vroomen Contents 9.1 Introduction... 147 9.2 Measuring Intersensory Synchrony: Temporal Order Judgment Task and Simultaneity Judgment

More information

Auditory Scene Analysis

Auditory Scene Analysis 1 Auditory Scene Analysis Albert S. Bregman Department of Psychology McGill University 1205 Docteur Penfield Avenue Montreal, QC Canada H3A 1B1 E-mail: bregman@hebb.psych.mcgill.ca To appear in N.J. Smelzer

More information

Changing expectations about speed alters perceived motion direction

Changing expectations about speed alters perceived motion direction Current Biology, in press Supplemental Information: Changing expectations about speed alters perceived motion direction Grigorios Sotiropoulos, Aaron R. Seitz, and Peggy Seriès Supplemental Data Detailed

More information

Bayesian integration of visual and auditory signals for spatial localization

Bayesian integration of visual and auditory signals for spatial localization Battaglia et al. Vol. 20, No. 7/July 2003/J. Opt. Soc. Am. A 1391 Bayesian integration of visual and auditory signals for spatial localization Peter W. Battaglia, Robert A. Jacobs, and Richard N. Aslin

More information

Hierarchical Bayesian Modeling of Individual Differences in Texture Discrimination

Hierarchical Bayesian Modeling of Individual Differences in Texture Discrimination Hierarchical Bayesian Modeling of Individual Differences in Texture Discrimination Timothy N. Rubin (trubin@uci.edu) Michael D. Lee (mdlee@uci.edu) Charles F. Chubb (cchubb@uci.edu) Department of Cognitive

More information

Magnitude Estimation Reveals Temporal Binding at Super-Second Intervals

Magnitude Estimation Reveals Temporal Binding at Super-Second Intervals Journal of Experimental Psychology: Human Perception and Performance 29, Vol. 35, No. 5, 1542 1549 29 American Psychological Association 96-1523/9/$12. DOI: 1.137/a14492 Magnitude Estimation Reveals Temporal

More information

Visual and Auditory Velocity Perception and Multimodal Illusions. Katherine S. Gasaway. Advisor: Paul M. Corballis, PhD

Visual and Auditory Velocity Perception and Multimodal Illusions. Katherine S. Gasaway. Advisor: Paul M. Corballis, PhD 1 Running Head: VISUAL AND AUDITORY VELOCITY Visual and Auditory Velocity Perception and Multimodal Illusions Katherine S. Gasaway Advisor: Paul M. Corballis, PhD Reviewers: Paul M. Corballis, PhD, Eric

More information

Existence of competing modality dominances

Existence of competing modality dominances DOI 10.3758/s13414-016-1061-3 Existence of competing modality dominances Christopher W. Robinson 1 & Marvin Chandra 2 & Scott Sinnett 2 # The Psychonomic Society, Inc. 2016 Abstract Approximately 40 years

More information

Tactile Communication of Speech

Tactile Communication of Speech Tactile Communication of Speech RLE Group Sensory Communication Group Sponsor National Institutes of Health/National Institute on Deafness and Other Communication Disorders Grant 2 R01 DC00126, Grant 1

More information

Auditory dominance over vision in the perception of interval duration

Auditory dominance over vision in the perception of interval duration Exp Brain Res (29) 198:49 57 DOI 1.17/s221-9-1933-z RESEARCH ARTICLE Auditory dominance over vision in the perception of interval duration David Burr Æ Martin S. Banks Æ Maria Concetta Morrone Received:

More information

Running head: EFFECTS OF EMOTION ON TIME AND NUMBER 1. Fewer Things, Lasting Longer: The Effects of Emotion on Quantity Judgments

Running head: EFFECTS OF EMOTION ON TIME AND NUMBER 1. Fewer Things, Lasting Longer: The Effects of Emotion on Quantity Judgments Running head: EFFECTS OF EMOTION ON TIME AND NUMBER 1 In Press in Psychological Science Fewer Things, Lasting Longer: The Effects of Emotion on Quantity Judgments Laura N. Young*, Department of Psychology,

More information

PAPER Children do not recalibrate motor-sensory temporal order after exposure to delayed sensory feedback

PAPER Children do not recalibrate motor-sensory temporal order after exposure to delayed sensory feedback Developmental Science (214), pp 1 1 DOI: 1.1111/desc.12247 PAPER Children do not recalibrate motor-sensory temporal order after exposure to delayed sensory feedback Tiziana Vercillo, 1 David Burr, 2,3

More information

A Race Model of Perceptual Forced Choice Reaction Time

A Race Model of Perceptual Forced Choice Reaction Time A Race Model of Perceptual Forced Choice Reaction Time David E. Huber (dhuber@psych.colorado.edu) Department of Psychology, 1147 Biology/Psychology Building College Park, MD 2742 USA Denis Cousineau (Denis.Cousineau@UMontreal.CA)

More information

Categorical Perception

Categorical Perception Categorical Perception Discrimination for some speech contrasts is poor within phonetic categories and good between categories. Unusual, not found for most perceptual contrasts. Influenced by task, expectations,

More information

Supplementary materials for: Executive control processes underlying multi- item working memory

Supplementary materials for: Executive control processes underlying multi- item working memory Supplementary materials for: Executive control processes underlying multi- item working memory Antonio H. Lara & Jonathan D. Wallis Supplementary Figure 1 Supplementary Figure 1. Behavioral measures of

More information

Auditory modulation of visual apparent motion with short spatial and temporal intervals

Auditory modulation of visual apparent motion with short spatial and temporal intervals Journal of Vision (2010) 10(12):31, 1 13 http://www.journalofvision.org/content/10/12/31 1 Auditory modulation of visual apparent motion with short spatial and temporal intervals Hulusi Kafaligonul Gene

More information

Exploring a brightness-drag illusion. Author. Published. Journal Title DOI. Copyright Statement. Downloaded from. Griffith Research Online

Exploring a brightness-drag illusion. Author. Published. Journal Title DOI. Copyright Statement. Downloaded from. Griffith Research Online Exploring a brightness-drag illusion Author Habota, Tina, Chappell, Mark Published 2011 Journal Title Perception DOI https://doi.org/10.1068/p6767 Copyright Statement 2011 Pion Ltd., London. The attached

More information

2012 Course : The Statistician Brain: the Bayesian Revolution in Cognitive Science

2012 Course : The Statistician Brain: the Bayesian Revolution in Cognitive Science 2012 Course : The Statistician Brain: the Bayesian Revolution in Cognitive Science Stanislas Dehaene Chair in Experimental Cognitive Psychology Lecture No. 4 Constraints combination and selection of a

More information

Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli

Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli Psychon Bull Rev (2011) 18:123 128 DOI 10.3758/s13423-010-0027-z Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli Shoko Kanaya & Kazuhiko Yokosawa Published

More information

Representational Momentum Beyond Internalized Physics

Representational Momentum Beyond Internalized Physics CURRENT DIRECTIONS IN PSYCHOLOGICAL SCIENCE Representational Momentum Beyond Internalized Physics Embodied Mechanisms of Anticipation Cause Errors in Visual Short-Term Memory Dirk Kerzel University of

More information

TEMPORAL ORDER JUDGEMENTS A SENSITIVE MEASURE FOR MEASURING PERCEPTUAL LATENCY?

TEMPORAL ORDER JUDGEMENTS A SENSITIVE MEASURE FOR MEASURING PERCEPTUAL LATENCY? TEMPORAL ORDER JUDGEMENTS A SENSITIVE MEASURE FOR MEASURING PERCEPTUAL LATENCY? Katharina Weiß and Ingrid Scharlau Department of Cultural Sciences, Paderborn University, 33098 Paderborn, Germany katharina.weiss@uni-paderborn.de,

More information

Supporting Information

Supporting Information 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 Supporting Information Variances and biases of absolute distributions were larger in the 2-line

More information

Bayesian integration in sensorimotor learning

Bayesian integration in sensorimotor learning Bayesian integration in sensorimotor learning Introduction Learning new motor skills Variability in sensors and task Tennis: Velocity of ball Not all are equally probable over time Increased uncertainty:

More information

Coexistence of Multiple Modal Dominances

Coexistence of Multiple Modal Dominances Coexistence of Multiple Modal Dominances Marvin Chandra (mchandra@hawaii.edu) Department of Psychology, University of Hawaii at Manoa 2530 Dole Street, Honolulu, HI 96822, USA Christopher W. Robinson (robinson.777@osu.edu)

More information

The influence of regularities on temporal order judgments

The influence of regularities on temporal order judgments The influence of regularities on temporal order judgments Sanne Haverkort July 2014 Bachelor research project for Liberal Arts and Sciences Major: Cognitive and Neurobiological Psychology Author: Sanne

More information

Running head: PERCEPTUAL GROUPING AND SPATIAL SELECTION 1. The attentional window configures to object boundaries. University of Iowa

Running head: PERCEPTUAL GROUPING AND SPATIAL SELECTION 1. The attentional window configures to object boundaries. University of Iowa Running head: PERCEPTUAL GROUPING AND SPATIAL SELECTION 1 The attentional window configures to object boundaries University of Iowa Running head: PERCEPTUAL GROUPING AND SPATIAL SELECTION 2 ABSTRACT When

More information

Virtual Reality Testing of Multi-Modal Integration in Schizophrenic Patients

Virtual Reality Testing of Multi-Modal Integration in Schizophrenic Patients Virtual Reality Testing of Multi-Modal Integration in Schizophrenic Patients Anna SORKIN¹, Avi PELED 2, Daphna WEINSHALL¹ 1 Interdisciplinary Center for Neural Computation, Hebrew University of Jerusalem,

More information

Offsets and prioritizing the selection of new elements in search displays: More evidence for attentional capture in the preview effect

Offsets and prioritizing the selection of new elements in search displays: More evidence for attentional capture in the preview effect VISUAL COGNITION, 2007, 15 (2), 133148 Offsets and prioritizing the selection of new elements in search displays: More evidence for attentional capture in the preview effect Jay Pratt University of Toronto,

More information

Dynamic and predictive links between touch and vision

Dynamic and predictive links between touch and vision Exp Brain Res (2002) 145:50 55 DOI 10.1007/s00221-002-1085-x RESEARCH ARTICLE Rob Gray Hong Z. Tan Dynamic and predictive links between touch and vision Received: 21 August 2001 / Accepted: 25 February

More information

Recalibrating the body: visuotactile ventriloquism aftereffect

Recalibrating the body: visuotactile ventriloquism aftereffect Recalibrating the body: visuotactile ventriloquism aftereffect Majed Samad 1 and Ladan Shams 1,2 1 Department of Psychology, University of California, Los Angeles, CA, United States of America 2 Department

More information

The Plasticity of Temporal Perception: Perceptual Training Enhances Multisensory Temporal. Acuity. Matthew Allen De Niear.

The Plasticity of Temporal Perception: Perceptual Training Enhances Multisensory Temporal. Acuity. Matthew Allen De Niear. The Plasticity of Temporal Perception: Perceptual Training Enhances Multisensory Temporal Acuity By Matthew Allen De Niear Dissertation Submitted to the Faculty of the Graduate School of Vanderbilt University

More information

Ch.20 Dynamic Cue Combination in Distributional Population Code Networks. Ka Yeon Kim Biopsychology

Ch.20 Dynamic Cue Combination in Distributional Population Code Networks. Ka Yeon Kim Biopsychology Ch.20 Dynamic Cue Combination in Distributional Population Code Networks Ka Yeon Kim Biopsychology Applying the coding scheme to dynamic cue combination (Experiment, Kording&Wolpert,2004) Dynamic sensorymotor

More information

Thesis. Attentional modulation of frequency specific distortions of event time. Utrecht University

Thesis. Attentional modulation of frequency specific distortions of event time. Utrecht University Utrecht University Master Psychology, Toegepaste Cognitieve Psychologie. Thesis Attentional modulation of frequency specific distortions of event time Pascal Fruneaux, 3269264 26-06- 2012 Supervisors:

More information

Simultaneity constancy

Simultaneity constancy Perception, 24, volume 33, pages 149 ^ 16 DOI:1.168/p5169 Simultaneity constancy Agnieszka Kopinska, Laurence R Harris Centre for Vision Research, York University, Toronto, Ontario M3J 1P3, Canada; e-mail:

More information

Cultural Differences in Cognitive Processing Style: Evidence from Eye Movements During Scene Processing

Cultural Differences in Cognitive Processing Style: Evidence from Eye Movements During Scene Processing Cultural Differences in Cognitive Processing Style: Evidence from Eye Movements During Scene Processing Zihui Lu (zihui.lu@utoronto.ca) Meredyth Daneman (daneman@psych.utoronto.ca) Eyal M. Reingold (reingold@psych.utoronto.ca)

More information

The influence of visual motion on fast reaching movements to a stationary object

The influence of visual motion on fast reaching movements to a stationary object Supplemental materials for: The influence of visual motion on fast reaching movements to a stationary object David Whitney*, David A. Westwood, & Melvyn A. Goodale* *Group on Action and Perception, The

More information

Visual motion influences the contingent auditory motion aftereffect Vroomen, Jean; de Gelder, Beatrice

Visual motion influences the contingent auditory motion aftereffect Vroomen, Jean; de Gelder, Beatrice Tilburg University Visual motion influences the contingent auditory motion aftereffect Vroomen, Jean; de Gelder, Beatrice Published in: Psychological Science Publication date: 2003 Link to publication

More information

Time flies when we read taboo words

Time flies when we read taboo words Psychonomic Bulletin & Review 2010, 17 (4), 563-568 doi:10.3758/pbr.17.4.563 Time flies when we read taboo words JASON TIPPLES University of Hull, Hull, England Does time fly or stand still when one is

More information

Single cell tuning curves vs population response. Encoding: Summary. Overview of the visual cortex. Overview of the visual cortex

Single cell tuning curves vs population response. Encoding: Summary. Overview of the visual cortex. Overview of the visual cortex Encoding: Summary Spikes are the important signals in the brain. What is still debated is the code: number of spikes, exact spike timing, temporal relationship between neurons activities? Single cell tuning

More information

ARTICLE IN PRESS. Vision Research xxx (2008) xxx xxx. Contents lists available at ScienceDirect. Vision Research

ARTICLE IN PRESS. Vision Research xxx (2008) xxx xxx. Contents lists available at ScienceDirect. Vision Research Vision Research xxx (2008) xxx xxx Contents lists available at ScienceDirect Vision Research journal homepage: www.elsevier.com/locate/visres Intertrial target-feature changes do not lead to more distraction

More information

THE PRESENTATION OF LONG TERM DURATION OF BODY MOVEMENT IN IMPRESSIONIST ARTWORKS DIFFERENTLY DISTORT THE PERCEPTION OF TIME

THE PRESENTATION OF LONG TERM DURATION OF BODY MOVEMENT IN IMPRESSIONIST ARTWORKS DIFFERENTLY DISTORT THE PERCEPTION OF TIME THE PRESENTATION OF LONG TERM DURATION OF BODY MOVEMENT IN IMPRESSIONIST ARTWORKS DIFFERENTLY DISTORT THE PERCEPTION OF TIME Francisco Carlos Nather and José Lino Oliveira Bueno Department of Psychology,

More information