Body Schema Modulation & Tool Embodiment in Real & Virtual Environments


Body Schema Modulation & Tool Embodiment in Real & Virtual Environments

by

Kimberley Jovanov

A thesis submitted in conformity with the requirements for the degree of Master of Science
Exercise Science
University of Toronto

Copyright by Kimberley Jovanov 2016

Body Schema Modulation & Tool Embodiment in Real & Virtual Environments

Abstract

Kimberley Jovanov
Master of Science
Exercise Science
University of Toronto
2016

Tool embodiment occurs when the neural representation of the body (i.e., the body schema) incorporates the representation of a tool following interaction with that tool. The present thesis examined whether tool embodiment can occur through different mediums of tool interaction. Participants completed a response time task (sensitive to tool embodiment) before and after interacting with one of three tools. In the physical tool interaction, participants held a real plastic rake to move a ball around a course. In two virtual tool interactions, an avatar holding a virtual rake was projected onto a screen and participants used a keyboard or a Wii remote to move the virtual rake. Response time results indicate that, prior to interaction, the rake was processed as separate from the body. However, through real and virtual tool experience, the rake was embodied such that it was incorporated into the body schema and processed as an extension of the hand.

Acknowledgments

Thank you to everyone who has encouraged and supported me throughout my academic journey. To all of my committee members: thank you for your thoughtful contributions to this thesis. First, to my supervisor and mentor, Dr. Tim Welsh, thank you for giving me the opportunities and confidence to explore my passions and for providing unwavering support and guidance throughout my time in the lab. To my committee member and professor, Dr. Luc Tremblay, thank you for pushing me to challenge the accepted wisdom and go beyond the limits I thought I had. To my committee member, Dr. Ali Mazalek, thank you to you and your lab members for providing the technical expertise and support that allowed this thesis project to happen. To Dr. Lawrence Grierson, thank you for your helpful comments and criticisms that helped shape the final document. To all of my lab-mates in the AA and PMB labs, thank you for all of your advice, honesty, and friendship over the past few years. Lastly, thank you to my parents and family, Alex, and Sam for your encouragement, support, endless laughter, and all of your love. I am truly grateful for the people I have had the privilege of working with throughout my degree, who have inspired and prepared me to explore future professional and personal opportunities.

Table of Contents

Acknowledgments
Table of Contents
List of Tables
List of Figures
Introduction
Chapter 1: Literature Review
1.1 Body and Space Representation
1.2 Dynamic Properties of Space Representation
1.2.1 Neurophysiological Evidence
1.2.2 Behavioural Evidence
1.3 Tool Embodiment in a Virtual Environment
1.4 The Current Project: Purpose and Specific Hypotheses
1.5 Predictions
Chapter 2: Methods
2.1 Participants
2.2 Apparatus, Tasks, and Procedures
2.2.1 Pre-/Post-Training RT Task Completed in Phases 1 and 3
2.2.2 Rake Interactions Completed in Phase 2
2.2.3 Physical Tool Interaction Group
2.2.4 Virtual-Immersive Tool Interaction Group
2.2.5 Virtual-Keyboard Tool Interaction Group
Chapter 3: Results
3.1 Data Reduction and Analysis
3.2 Body-Part Compatibility Effect
3.3 Effects of Tool-use on Hand RTs
3.4 The Relationship between Video Game Experience and Changes in RTs to the Rake
Chapter 4: Discussion
4.1 Summary and Overview
4.2 Tool Embodiment in the Different Training Conditions
4.3 The Relationship between Tool Embodiment and Video Game Experience
4.4 Limitations
4.5 Conclusions and Future Directions
References

List of Tables

Table 1. Means and standard deviations of the number of response errors.

List of Figures

Fig. 1 Mean response time in milliseconds (ms) as a function of Responding Limb and Target Location. Standard error of the mean (S.E.M.) bars are shown. The data depicted here are from Jovanov et al. (2015). This pattern of RTs is a demonstration of the basic body-part compatibility effect. A similar pattern of findings is anticipated in the present work.

Fig. 2 Mean response time in ms as a function of Time and Target Location for hand responses. S.E.M. bars are shown. The data depicted here are from Jovanov et al. (2015). A similar pattern of findings is anticipated in the present work if tool embodiment can occur for a virtual tool.

Fig. 3 Mean response time in ms as a function of Time and Target Location for hand responses. S.E.M. bars are shown. The data depicted here are adapted from Jovanov et al. (2015). A similar pattern of findings is anticipated in the present work if tool embodiment cannot occur for a virtual tool.

Fig. 4 Examples of the target displays with the target presented on: a) the foot; b) the hand; and c) the rake.

Fig. 5 Image of an exemplar participant completing the training in the physical tool group. Participants controlled the rake to move the ball around the course.

Fig. 6 Images of the tool interaction system for the virtual-immersive group (left panel) and of the virtual environment for the tool interaction (right panel). Movements of the tool interaction system under a tarp were accurately translated to the projected movements of the arm and rake in the virtual environment.

Fig. 7 Image of an exemplar participant completing the training in the virtual-keyboard group. Participants pressed the buttons on the number pad of the keyboard to move the rake and ball around the virtual course.

Fig. 8 Mean response time in ms as a function of Responding Limb and Target Location. 95% Confidence Interval bars are shown.

Fig. 9 Mean response time in ms as a function of Time and Target Location for hand responses. 95% Confidence Interval bars are shown.

Fig. 10 Mean response time in ms as a function of Time and Target Location for hand responses for a) physical, b) virtual-immersive, and c) virtual-keyboard tool conditions. 95% Confidence Interval bars are shown.

Fig. 11 Mean difference between hand response times to targets presented on the rake, pre and post tool interaction, in ms, as a function of previous video game experience for a) physical, b) virtual-immersive, and c) virtual-keyboard tool conditions.

Introduction

The human brain continuously receives input from a variety of stimuli, including sources of information from the external environment as well as those generated within the body. The brain receives, condenses, and integrates all of this information and generates an output to act on these stimuli through simple and complex actions. This action output consequently produces further sensory stimuli that then become additional incoming information for the brain. Thus, sensation, perception, and action create a closed-loop cycle. The fact that the brain interprets and efficiently acts on this vast amount of sensory stimuli is astounding and has been a critical focus of investigation across the fields of motor control, neuroscience, and psychology. A large subset of our actions involves interacting with our physical environment and, through evolution, we have developed the ability to manipulate physical objects to achieve action goals. Whether we are drinking a pint of beer or shoveling snow, we manipulate a variety of everyday specific-use objects (i.e., tools such as knives, keys, and brooms) that provide an advantage for achieving our goals. How we accomplish our greater action goals through the manipulation of non-human objects is of critical scientific interest. Tools often provide a physical extension of our body or a functional adaptation that allows certain tasks to become possible (e.g., laparoscopic cameras) or more efficient (e.g., a shovel), above and beyond the capabilities of our actual physical body. In this way, we use external objects as if they were an extension of our physical body to perform actions that we would not otherwise be able to perform. Although the processes enabling tool-use are important for human endeavors, how the brain incorporates these tools into the motor planning process is not fully understood.
Understanding the mechanisms behind tool-use is critical both for advancing knowledge of human motor control and for improving the efficiency of our actions. For example, modern society has become heavily reliant on technology and, specifically, we have developed the ability to manipulate non-physical objects in virtual environments (e.g., on a computer). Modern medicine and engineering, among other fields, rely on this technology, and so the mechanisms underlying tool-use must be understood for this technology to be designed and used most effectively. The research reported in the present

thesis was specifically designed to investigate human motor control of physical and virtual tools to understand the underlying mechanisms by which these efficient actions occur. Prior to outlining the specific hypotheses of the thesis experiment, Chapter 1 will outline and review current literature on body and space representation, as well as tool embodiment in physical and virtual environments. Chapter 2 then presents the experimental design and methodology. The presentation of the methodology is followed by a report of the data analysis and results in Chapter 3. Finally, Chapter 4 presents a discussion of the data, limitations, future directions, and general conclusions.

Chapter 1: Literature Review

1.1 Body and Space Representation

Human beings are in a near perpetual state of action, exhibiting constant reciprocal interaction with physical and social environments. From an egocentric perspective, an individual's brain receives multisensory information from the environment as well as stimuli generated within the body. These stimuli are integrated and used to prepare and control voluntary action and behaviour. As a consequence of systematic interactions between visual, tactile, proprioceptive, and vestibular inputs, neural network activity becomes gradually refined and organized to create a mental construct that comprises the sensory impressions, perceptions, and ideas about the dynamic organization of one's own body and its relation to other bodies and the rest of the world (Berlucchi & Aglioti, 1997). This concept of a neural representative map of our body was originally divided into two components: a mainly unconscious, motor-based representation called the body schema, and a more conscious, perceptually-based representation termed the body image (Head & Holmes, 1911). Researchers in various fields have debated the characteristics and mechanisms of body and space representations. Specifically, the action-based body schema representation, as it relates to the field of motor control, has become an intensive area of interest throughout neurophysiological, neuropsychological, and neurobehavioural research. This body schema, and in particular plastic, use-dependent changes in the body schema, is the focus of the present thesis. It has been theorized that our brain represents personal space (space incorporating the body) and peri-personal space (space immediately surrounding the body) differently than extra-personal space (space far from the body) (Rizzolatti, Fadiga, Gallese, & Fogassi, 1997).
The representation of the body and its peri-personal regions is more thorough and complex, and has a larger impact on our sensory system, which can help us distinguish the boundaries of our body and the space that is reachable by our body (Craig & Rollman, 1999). Neurophysiological evidence supports the idea that our body schema involves an integrated neural representation of visual, somatosensory, and auditory information that is constantly updated to monitor the position

and movement of the body in relation to nearby objects (e.g., Holmes & Spence, 2004). Rather than relying on a single central processing centre, body representation has been identified as an interacting network of cortical and subcortical structures including the pre-motor cortex, posterior parietal cortex, extrastriate body area, fusiform body area, and the somatosensory cortex (Graziano, 1994). Somatotopic organization of the primary somatosensory cortex and pre-motor cortex allows incoming sensory signals from a specific location of the body to be processed by a corresponding cortical area to convey information about the body part and the space immediately surrounding it (Graziano, 1994). The primary somatosensory cortex is critical for receiving and refining somatosensory information, which can then be integrated with sensory modalities processed by other cortical structures, such as vision (Graziano, 1999). Also, the ventral aspect of the pre-motor cortex (PMv) borders the primary motor cortex and contains somatotopic maps of the arms, hands, and face (Holmes & Spence, 2004). Neural recordings of this area have identified a subset of neurons that respond to both somatosensory and visual stimuli, which is consistent with the bi-modal neuron hypothesis supported by single neuron recordings in monkeys (Graziano, 1999; Fogassi, Gallese, Fadiga, Luppino, Matelli, & Rizzolatti, 1996). It has been suggested that the PMv integrates multisensory information to create a representation of peri-personal space, with recent evidence suggesting additional tri-modal neurons in humans that respond to auditory information along with somatosensory and visual information regarding stimulus location relative to the body (Graziano, 1999). Critically, PMv activity continues when an object or limb is no longer seen, suggesting that this area holds a neural representation of the body that is continually modulated by incoming sensory information (Graziano, Hu, & Gross, 1997).
Additionally, the posterior parietal cortex contains neurons with similar properties to those of the ventral pre-motor cortex in that it receives somatosensory input from the primary somatosensory cortex and visual input from the visual cortex (Graziano & Gross, 1995). This information is integrated in the region that lies posterior to the central somatosensory cortex and is responsible for encoding posture and limb movement. This area has been proposed to have a significant role in the planning and execution of movement, as well as providing a dynamic update of the body schema on a movement-to-movement basis (Graziano, Cooke, & Taylor, 2000).

Lastly, there is evidence from functional magnetic resonance imaging (fMRI) analysis of activity-related changes in blood flow in the human brain to suggest that body-selective brain regions are found bilaterally in the posterior inferior temporal sulcus and middle temporal gyrus region. These areas have been labelled the extrastriate body area (EBA; see Downing, Jiang, Shuman, & Kanwisher, 2001). These blood flow changes, and thus the assumed associated neural responses, are strong and selective when an individual views images of bodies and body parts, relative to when viewing faces and objects. Interestingly, these responses generalized over life-like pictures, static line drawings, stick figures, and silhouette pictorial representations of the body, suggesting that the schema representation is abstract across visual features of the body (Downing et al., 2001). Further, recent evidence supports the somatotopic nature of EBA organization, with selective regional responses to hand versus foot regions of the body (Downing, Chan, Peelen, Dodds, & Kanwisher, 2006). The EBA receives information that updates body representations after movement due to novel coupling between visual and motor representations. This coupling has been observed via increased EBA activation during and following the execution of motor actions (Astafiev, Stanley, Shulman, & Corbetta, 2004). Movement execution might affect the body's representation through related proprioceptive inputs or perhaps through corollary discharge signals from nearby motor areas. Similar to the parietal cortex, EBA integration of internal action signals with external visual stimuli could serve to distinguish between one's own and another's body parts (Jeannerod, 2004). In addition, lesions or functional disruptions to this body-selective cortical area are characterized by specific deficits including reduced self-identification and spatial neglect (see Peelen & Downing, 2007, for a review).
Overall, it is hypothesized that our brain has developed a neural map that represents our body in space and functions to provide us with a base estimate of our body's shape, size, and position with respect to our environment (see Maravita & Iriki, 2004, for a review). This representation, in turn, allows us to generate actions to interact effectively with our environment. Research has provided evidence of an action-oriented sensitivity of this body representation, which is continuously updated through the integration of multisensory stimuli from the environment and from within the body itself (Holmes & Spence, 2004; Graziano, 1999; Graziano, Cooke, & Taylor, 2000). This representative network is hypothesized to exist through a series of interconnected, somatotopically organized cortical structures including the pre-motor

cortex, posterior parietal cortex, extrastriate body area, and the somatosensory cortex. This system of cortical areas allows the integration of incoming sources of information to produce a representation of the body that can be utilized in motor planning and execution (Downing et al., 2001; see Peelen & Downing, 2007, for a review).

1.2 Dynamic Properties of Space Representation

It has been suggested that the body schema is sub-served by a distributed neural network, and that this body schema is open to the continuous shaping influences of experience. The idea that the body schema is shaped and updated by experience has been strongly supported in the literature investigating the impact of tool interaction on the visuotactile representation of peri-personal space. Tool interaction has been suggested both to expand our physical capabilities and to demonstrate the adaptive plasticity of our brain's body representation (e.g., Holmes & Spence, 2004). This work includes a wide range of neurophysiological and behavioural evidence from animal studies, as well as work with neurotypical and atypical populations. The results of this work suggest that, as a result of purposeful interaction with a tool, the body schema expands to incorporate the tool, as if it were an extension of the physical effector (Maravita & Iriki, 2004). Modulation of the body schema following tool-use, referred to as tool embodiment, highlights the dynamic properties of the neural representation of the body in space and how critical action is in modulating our perception and future action capabilities.

1.2.1 Neurophysiological Evidence

Early neurophysiological evidence supporting the action-oriented body schema representation has led researchers to investigate how functionally specialized parietofrontal circuits are able to transform sensory representations of the body and the environment into motor plans for prehension.
Prehension refers to the reaching, grasping, and manipulation of objects (Johnson-Frey, 2003). Through natural observation, we can see that there is a universality of prehension across species, including our closest relative on the evolutionary tree, the chimpanzee. Early theory, now supported by sophisticated neurophysiological evidence, suggests the existence of homologous brain circuits between monkeys and humans for object prehension (Johnson-Frey, 2003; Culham & Kanwisher, 2001). This comparative work has allowed us to understand and predict the neural mechanisms

underlying human reaching, grasping, and object manipulation. For example, during reaching movements toward an object, the individual is required to form a representation of the object's location and orientation while continually representing their limb position in space (Johnson-Frey, 2003). Neurophysiological measurements in monkeys during these reaching movements revealed interconnected regions of the medial intraparietal sulcus (IPS) and the dorsal premotor cortex (PMd) (Johnson, Ferraina, Bianchi, & Caminiti, 1996; Andersen & Buneo, 2002), such that the PMd receives direct visual and proprioceptive input from the superior parietal lobule (SPL), which is then integrated to plan goal-directed reaching (Hoshi & Tanji, 2000). Extending this work to human studies, Johnson et al. (2002) supported these previous findings by measuring neural activity during reaching in the putative homologs of the IPS and PMd. During grasping movements, individuals must integrate sensory information to form a representation of the object's intrinsic spatial properties, such as shape, size, and texture, within the hand. During grasping movements in monkeys, area F5 in the ventral premotor cortex receives input from the anterior intraparietal area (AIP), forming a ventral parietofrontal circuit (Sakata, Taira, Kusunoki, Murata, & Tanaka, 1997). Cellular activity in area F5 also exhibited large increases that were specific to different types of manual actions (Rizzolatti, Fogassi, & Gallese, 2002). This functional connectivity of the ventral parietofrontal circuit during grasping has been observed in the homologous human structures (anterior IPS and inferior frontal cortex) during similar grasping movements (Binkofski, Buccino, Posse, Seitz, Rizzolatti, & Freund, 1999).
Additionally, research examining the effects of lesions to these human structures revealed hand configuration deficits during grasping movements, supporting their role in grasp planning and execution (Binkofski, Dohle, Posse, Stephan, Hefter, Seitz, & Freund, 1998). Moreover, it appears that object manipulation incorporates highly complex representations of the body relative to the object and the object relative to the body (Graziano & Gross, 1998). Further, a broader joint representation of the object and body must be formed in order to cooperatively complete a goal-directed action (Johnson-Frey, 2003). Critically, seminal neurophysiological studies of object manipulation (i.e., tool-use) using single-neuron recordings have revealed the plastic nature of space representations (Iriki, Tanaka, & Iwamura, 1996). Prior to this work, it had been postulated that the perceptual assimilation between a tool and hand following experience using the tool was a result of body schema modulation; however, the neural

mechanism was unknown (Head & Holmes, 1911; Paillard, 1993). Work investigating the body schema (reviewed in Section 1.1) identified several underlying neural networks that formed the basis for investigations of tool-use (Iriki et al., 1996). Specifically, several body schema neural networks, such as the ventral intraparietal area (VIP) and intraparietal sulcus (IPS), contain a high proportion of bi-modal neurons that respond to both somatosensory and visual stimuli. These bi-modal cells have a somatosensory (tactile) and a visual receptive field, with respect to their representative body part (e.g., the hand), that activates when a tactile or visual stimulus is presented near or within the limb-specific receptive field. As the stimulus moves away from the receptive field, the cell's discharge decreases. Accordingly, Iriki et al. (1996) emphasized that these bi-modal neurons were neural correlates of body schema-related networks and proposed that they were the cellular components that would be modulated during interaction with a tool. To this end, they conducted a single neuron recording study in which they recorded from bi-modal neurons in the VIP and IPS that represented the hand region of a macaque monkey. At the beginning of the study, the researchers recorded from these bi-modal neurons while presenting stimuli off and on the tool. The monkey was then trained to sit on a chair with its head in a fixed position. Food pellets were placed on a table at the monkey's waist height and the monkey was trained to retrieve a food pellet using a rake-shaped tool. Following 5 minutes of active tool-use, expansive increases in the activation of hand-specific bi-modal neurons were observed, such that these neurons responded when visual stimuli were presented on the tool.
These findings suggest that, through interaction, the visual receptive field, originally confined to the hand, was expanded along the length of the tool. The somatosensory receptive field was unchanged during tool-use. Additionally, following the tool-use condition, the monkeys continued to retrieve food without the use of the tool. Within 3 minutes of the cessation of tool-use, the visual receptive field contracted back to its original size on the hand. Overall, this work supported the hypothesis that a non-corporeal object can be represented as a corporeal object (part of the body), as if the tool were a transient extension of the limb (Iriki et al., 1996). In this case, even though the tool was part of extra-personal space prior to tool-use, the tool became represented as within peri-personal or action space after tool-use. Bi-modal neuron activation and modulation in these areas during tool-use (object manipulation) further identified neural substrates and interconnected circuits between the VIP and PMv that underlie the

visuotactile representation of peri-personal space during object manipulation (Graziano & Gross, 1998). Since these central neurophysiological observations, research has been conducted to further define the parameters of body schema plasticity. This work provided the initial idea that we incorporate objects into our representation of the body during and following purposeful use (referred to as tool embodiment). This concept has since been extended to neurophysiological and behavioural observations in human populations that support the homologous parietofrontal networks activated during object manipulation (Johnson-Frey, 2003; Maravita & Iriki, 2004). For example, there is fMRI evidence of increased activity in the intraparietal sulcus during object manipulation (Fincham, Carter, Van Veen, Stenger, & Anderson, 2002), as well as observations of tool manipulation deficits in people with known lesions to proposed body-representation areas (Berti & Frassinetti, 2000), that support previous findings on the structures involved in prehension and, critically, on how activations in these areas are modulated during and following tool manipulation (Johnson-Frey, 2003). Further, because complex neurophysiological designs, such as single neuron recordings, are difficult to replicate in human populations, a number of behavioural studies in humans have provided support for the tool embodiment hypothesis and further determined the characteristics of human body schema plasticity. Of specific relevance to the present thesis, this line of research has sought to identify the conditions required to elicit these changes in cortical representations and the extent to which the representations can be modulated.
The following section provides a review of this body of literature.

1.2.2 Behavioural Evidence

Initial behavioural evidence suggesting a more complex, longer-lasting relationship between the body schema and objects designed for a specific use (i.e., tools) came from a patient who was unaware of their left arm paralysis (Aglioti & Berlucchi, 2010). As a result of a brain lesion, the individual was only able to identify the significance of their wedding ring when the ring was off their hand. This joint representation of the left hand and ring suggested that the ring had become part of the associated left-hand body schema and was therefore also susceptible to the associated dysfunction when it was on the body part. Since this observation, a variety of behavioural paradigms, using special and neuro-typical populations,

have been used to deduce information about the functional characteristics underlying tool embodiment that complement previous neurophysiological work. These behavioural paradigms have employed perceptual judgements such as perceived sensation of stimuli, target distance estimates, and arm length estimates, with the underlying assumption that the incorporation of the tool into our body representation results in perceptual consequences which can then reveal characteristics of tool embodiment processes (Berti & Frassinetti, 2000; Farnè & Làdavas, 2000; Kao & Goodale, 2009; Bourgeois, Farnè, & Coello, 2014). Evidence for separate neural systems that encode peri-personal and extra-personal space was derived from tool-use research in patients with spatial neglect (Maravita, Husain, Clarke, & Driver, 2001; Farnè & Làdavas, 2001; Berti & Frassinetti, 2000). Spatial neglect is a consequence of a brain lesion and is characterized by the inability to detect stimuli presented contralateral to the side of brain damage (Halligan & Marshall, 1991). Patients were tested on their ability to dissociate near and far space by identifying the point of bisection of lines presented in near and far space using a laser pointer or a tool (Maravita, Husain, Clarke, & Driver, 2001). Patients with neglect showed a 24% point-identification error (a line bisection bias towards ipsilesional space) in near space and only a 9% error in far space. This dissociation suggests that these two spaces are represented differently and that near space (more associated with the body) corresponded with more dysfunction as a result of spatial neglect. Interestingly, when the patients bisected the lines with a hand-held stick (i.e., a tool), identification error in far space increased from 9% to 27%, suggesting that the spatial neglect (which is confined to peri-personal space) was extended to incorporate the stick in far (previously extra-personal) space.
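The dissociation reported above can be made concrete with the percentages from the neglect work. The snippet below is an illustrative sketch (the dictionary layout and variable names are my own, not from the study); it only restates the reported bisection errors and the inference drawn from them: after tool-use, far-space performance resembles near-space performance rather than the far-space baseline.

```python
# Line-bisection error (%) reported for the neglect patients reviewed above
# (Maravita, Husain, Clarke, & Driver, 2001). Layout is illustrative only.
error = {
    ("near", "laser"): 24,  # strong ipsilesional bias in near space
    ("far", "laser"): 9,    # mild bias in far space with a laser pointer
    ("far", "stick"): 27,   # far space, bisecting with a hand-held stick
}

# If tool-use remaps far space into peri-personal space, far-space error
# with the stick should approach the near-space value, not the far baseline.
tool_effect = error[("far", "stick")] - error[("far", "laser")]
print(tool_effect)  # 18 percentage points of added far-space bias

closer_to_near = (abs(error[("far", "stick")] - error[("near", "laser")])
                  < abs(error[("far", "stick")] - error[("far", "laser")]))
print(closer_to_near)  # True: stick performance resembles near space
```

The comparison mirrors the verbal argument: with the stick in hand, the 27% far-space error sits far closer to the 24% near-space error than to the 9% far-space baseline.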
Thus, the neglect dysfunction in far space was comparable to that of near space when the tool was held. Another set of studies involved patients with tactile extinction, a condition resulting from brain damage in which patients are unable to detect a contralesional stimulus only when it is simultaneously presented with a competing ipsilesional stimulus (Farnè & Làdavas, 2000). A study of tool-use by Farnè and Làdavas (2000) involving people with tactile extinction revealed that when visual stimuli (i.e., light flashes) were presented on the end of a rake following goal-directed interaction with that rake, there was more interference with the detection of a simultaneous left tactile stimulation (i.e., a touch on the hand) when compared to detection rates prior to

rake interaction. It was suggested that this pattern of effects emerged because, before tool interaction, the rake was represented within extra-personal space and therefore was not considered part of the body. Because the rake was not part of the body, the visual stimulus on the rake did not interfere with detection of the tactile stimuli. However, once the rake was incorporated into peri-personal space through tool-use, stronger interference effects of the visual stimulus on the rake over the tactile stimulus were observed because the rake was now represented as an extension of the body. This key observation was foundational to understanding the flexible, interaction-dependent nature of space representation. However, the specific characteristics of tool embodiment, such as the necessary length of exposure to the tool, the type of tool action, or how long the modulation persists, remained undetermined. Consistent with previous single neuron recording evidence in monkeys, human neurobehavioural research has revealed that this plastic modification of the receptive field is not fixed, and will contract back to its normal extent 5-10 minutes after tool-use (Farnè & Làdavas, 2000). This finding suggests that the extension of the coding of space to incorporate the tool can be rapidly modified by performing goal-directed movements with a tool. Also, because this finding indicated that the modulation of the body schema accompanying tool-use could last up to 10 minutes following tool interaction, it provided a rationale for employing pre/post tool-use designs for observing perceptual and motor behaviour following tool interaction (Làdavas & Serino, 2008). This finding guided research assessing changes in behaviour following tool interaction and allowed these paradigms to be extended to the neuro-typical population.
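The pre/post logic described in this paragraph can be sketched as a simple comparison. All numbers below are hypothetical, chosen only to illustrate the inference pattern such designs rely on: embodiment is inferred when, after tool interaction, responses to stimuli on the tool come to resemble responses to stimuli on the hand.

```python
# Hypothetical mean response times (ms); illustrative values only,
# not data from any study discussed here.
pre = {"hand": 420, "foot": 455, "rake": 450}
post = {"hand": 425, "foot": 460, "rake": 430}

def rake_hand_gap(rts):
    """RT difference between rake targets and hand targets.
    A large gap suggests the rake is processed as a non-body object;
    a small gap suggests it is processed like part of the hand."""
    return rts["rake"] - rts["hand"]

gap_pre, gap_post = rake_hand_gap(pre), rake_hand_gap(post)
print(gap_pre, gap_post)  # 30 5

# Embodiment is inferred from a shrinking rake-hand gap after interaction.
embodied = gap_post < gap_pre
print(embodied)  # True
```

The design works precisely because the modulation persists for several minutes after tool-use: the post-test can be administered after the interaction phase and still capture the embodied state.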
Maravita, Spence, Kennett and Driver (2002) transferred the stimuli congruency paradigm used in research examining tactile extinction in special populations to neuro-typical individuals, with the assumption that visual stimuli presented on the same side of space as tactile stimulation have a larger interference effect (reduced ability to distinguish between stimuli) than when the visual and tactile stimuli are presented on opposite sides of space or different sides of the body. Maravita et al. (2002) asked participants to interact with a stick in each hand, with the visual stimuli located at the end of each stick. The key finding of their study was that when the participants crossed the sticks (without crossing their hands), such that the end of the left stick was situated in the right visual field, an opposite spatial congruency emerged: the visual stimuli in the right visual field (but connected to the left hand by the tool) had larger

tactile stimuli identification interference effects following active use of the tool. However, this effect was not observed when participants simply held the tools passively, emphasizing the importance of action-based schema modulation. Though insightful, this action-dependent schema modulation finding did not establish whether the active component was tool-interaction specific, as opposed to simply engaging the motor system in a general arm action. Farnè and Làdavas (2000) addressed the issue of action-dependent modulation by adding a control condition to their study, which measured the degree of cross-modal extinction at the end of a rake pre- and post-tool-use. They included a control group that simply pointed towards the targets (i.e., a point baseline group) that the tool group was required to reach and contact with their rake. Through a comparison of the participants' performance, they reported a comparable level of extinction in the pre-tool and point baseline groups (75 and 69%, respectively) and, critically, a significant difference in cross-modal extinction between the tool and point groups in the post-test (53 and 75%, respectively). That is, there was a significant decline in the degree of cross-modal extinction following tool interaction, as opposed to a statistically non-significant change in extinction following the simple pointing action. Based on these findings, the authors concluded that active interaction with the tool was a prerequisite to tool embodiment, rather than solely goal-directed limb movement (e.g., pointing movements without a tool). This study emphasized the need to physically interact with the tool to induce the proposed body schema modulation. However, a question left open by this finding is whether the tool must be used in a purposeful and goal-directed way for tool embodiment to take place. The specificity of tool action was further studied to understand whether the functionality of the tool could play a role in tool embodiment.
Again, using their cross-modal extinction paradigm, Farnè, Iriki and Làdavas (2005) examined whether the absolute or functional length of a tool is incorporated into peri-hand space by measuring the level of cross-modal extinction in near, middle and far space (0 cm, 30 cm, 60 cm) before and after participants used one of three tools. The tools consisted of a 30 cm rake, a 60 cm rake and a hybrid 60 cm rake whose functional end (i.e., prongs) was located 30 cm along the length of the tool. With respect to the hybrid tool, they found significantly less cross-modal extinction at the far location (60 cm) when compared to those who used the 60 cm tool. Critically, there was not a statistically significant difference in extinction effect between the 30 cm tool and the hybrid tool at the middle (30 cm)

target location. This study provided the first evidence that the spatial extent of tool embodiment is determined by the functionally effective length of the tool, as opposed to its absolute length. Consistent with this finding, a combined neurophysiological and behavioural study by Tomasino, Weiss and Fink (2012) found an increase in fMRI activity in the extrastriate body area when a more appropriate tool (i.e., joystick) was used in a more compatible environment (near space) than when a less appropriate tool (i.e., extended pliers) was used in a less congruent environment (near space). This finding suggests that the neural representation of the body is adapted in a functional manner, depending on tool compatibility. Bourgeois, Farnè and Coello (2014) extended this concept using a behavioural paradigm measuring the perceived reachability of a target pre- and post-tool interaction with a short or long tool. They reported a significant increase in perceived reachability with the long tool (which was physically capable of reaching the targets in the interaction task). Critically, they did not observe a statistically significant change in perceived reachability in participants who interacted with the short tool (which could not reach the targets). This result supports the previous findings on tool compatibility, and suggests that the tool must provide a functional advantage (in this case, an extension of the arm capable of reaching the targets) to modulate the underlying neural networks and body representation. This finding motivates an interesting area of future research: determining what defines a functional advantage and which aspects of the tool's properties (aside from increasing reach length), such as grasping capabilities, can modulate the body schema. This issue will become more relevant in the following section on how virtual tools could provide a functional, rather than strictly physical, advantage.
As it relates to the specific mechanisms of tool embodiment, researchers questioned whether improved processing of stimuli at the end of the tool was due to an extension of space representation, an addition of a multi-sensory processing area, or simply a shift in attention. Investigation of the receptive field/peri-personal space extension in monkeys and in human patients with extinction supported the theory that, following purposeful use of the rake, the whole length and axis of the tool is incorporated into the body schema, rather than there being a shift in multi-sensory integration from the hand to the tip of the tool (Iriki, Tanaka, & Iwamura, 1996; Farnè & Làdavas, 2000; Berti & Frassinetti, 2000; Bonifazi, Farnè, Rinaldesi & Làdavas, 2007). Behavioural findings of facilitated motor responses (reaction time) to stimuli on a tool

were observed along the entire length of the tool, as opposed to simply at its end, suggesting that the representation is extended along the entire functional length of the tool (Bonifazi, Farnè, Rinaldesi & Làdavas, 2007). Additionally, another unknown mechanistic issue of tool embodiment was whether the expansion effect was generalized around the entire body, or was specific to the limb interacting with the tool. This issue was recently addressed in a study that examined how tool-use influences the limb-specific coding of the tool with respect to the body schema (Jovanov, Mazalek, Clifton, Nitsche, & Welsh, 2015). In the Jovanov et al. (2015) study, a task thought to be sensitive to the coding of body parts in the body schema, the body-part compatibility task (Bach, Peatfield & Tipper, 2007), was adapted to explore tool embodiment. Neurophysiological evidence for the existence of bi-modal mirror neurons suggests a neural overlap between action observation and self-produced action (di Pellegrino et al., 1992). Bi-modal neurons are suggested to contribute to imitation and action priming, such that a motor response is facilitated when a similar action is observed. Bach et al. (2007) examined the role of attention in imitation using a unique body-part compatibility paradigm. Participants viewed a static image of an action end-state (a man kicking a soccer ball or a man typing on a keyboard) and were required to respond as fast as possible with a finger press or a foot press to red and blue targets, respectively, that were presented on the image. The target circles were presented either on the action site (the actor's foot or hand) or on an action-irrelevant site (the actor's head). The experimenters reported that hand responses were shorter when participants observed a hand action than when they observed a foot action, and vice versa for foot responses.
This pattern of results supports the theory of action priming, such that compatibility between the observed action and the action performed by the participant facilitates the compatible response and/or inhibits the incompatible response. This interpretation is predicated on the idea that action observation excites the homologous region in the brain, which facilitates response initiation with the similar limb. Additionally, this study addressed the critical role of spatial attention: compatibility effects only occurred when attention was directed to the action features (the kicking foot or typing hand) and did not appear when attention was directed to non-action features (the actor's head). Jovanov et al. (2015) used an adapted version of the body-part compatibility paradigm of Bach et al. (2007) to assess body-part coding and limb-specific tool embodiment. Participants

viewed a profile image of a woman holding a rake and were instructed to use their hand or foot to respond to targets (red or blue, respectively) presented on the foot, hand or end of the hand-held rake in the observed image. This compatibility task was performed prior to and following a goal-directed tool interaction (physical manipulation of the rake presented in the image). Prior to tool interaction, the experimenters observed body-part compatibility effects consistent with Bach et al. (2007), such that hand responses to targets presented on the hand were shorter than hand responses to targets presented on the foot or rake, and vice versa for foot responses. It was thought that the body-part compatibility effect emerged because the presentation of a target on a limb called attention to the limb (Bach et al., 2007) and excited the representation of that limb in the body schema. This body-part specific activation in the body schema then made the processing and generation of responses with that limb more efficient relative to other limbs. That is, when the location of the target was on a different part of the body from the responding limb, the activation of the body part in the body schema did not facilitate the activation and generation of the appropriate response. Thus, when the responding limb and the location of the target are coded as the same body part, shorter RTs are observed than when the responding limb and target location are coded as different body parts. In this way, the patterns of RTs are thought to indicate that the participant is engaging in self-other matching and limb mapping processes (e.g., Jovanov et al., 2015). The critical pattern of findings of the Jovanov et al. (2015) study emerged when comparing the pre/post-interaction changes in RTs for hand responses to targets on the rake. Prior to tool interaction, hand RTs were longer to targets presented on the rake than to a target on the hand.
This RT difference suggests that the tool was initially coded as being external to the hand in the body schema. However, following tool interaction, a significant decrease in hand RTs to targets on the rake was observed, suggesting that the tool became embodied within the body schema and was effectively coded as an extension of the hand. No such change in response times to targets on the hand or foot was observed. Critically, there was no statistically significant reduction in foot response times to (blue) targets on the rake, suggesting that the change in response times (i.e., tool embodiment) is limb-specific: the extension of body representation following tool interaction is specific to the limb that interacted with the tool, as opposed to an overall, general body schema expansion.

Most of the previous research has provided support for the tool embodiment hypothesis and identified several precursor characteristics required to adapt the body schema in this specific way. However, a current area of debate in the tool embodiment literature is the role of intention when interacting with a tool and whether it is required to induce modulation. Witt, Proffitt and Epstein (2005) tested perceived target distance following active, goal-directed use of a hand-held tool. Participants viewed two targets appearing simultaneously and then, following removal of the targets, were required to place two objects at the same distance at which the targets had just appeared. The results showed that targets beyond the reach of the hand appeared closer following tool interaction than before tool interaction, suggesting that the perception of the environment was altered following tool-use. Additionally, they did not observe a statistically significant difference in target distance estimates following static holding of a tool that the participants had no intention to use. This characteristic of intentional use preceding tool embodiment modulation in space representation supported previous explanations of findings of consistent changes in space representation (deduced through visual distance estimates) when holding a tool (Witt & Proffitt, 2008; Osiurak, Morgado & Palluel-Germain, 2012). However, the results of a study by Bourgeois, Farnè, and Coello (2014) did not support these earlier conclusions because they did not observe changes in their dependent measure (perceived reachability of a target) when the participants statically held the tool, but were clearly told they were going to use it. Thus, even though participants had the intention to use the tool, this intention did not affect the perception of space.
As well, de Grave, Brenner and Smeets (2011) had participants either verbally estimate how far an object was from their body or touch the object's location with either a hand or a stick. No differences in verbal distance judgments or touching responses were found between the blocks in which the stick or the hand was used. This discrepancy in behavioural findings following static tool-holding casts critical doubt on distance estimate paradigms and on the role of intention in tool embodiment processes. Overall, a strong body of neurophysiological and behavioural research has supported the hypothesis of an action-oriented body schema that acts to provide a dynamic and continuous representation of our body with respect to our environment and the objects we use to act in the environment. Through the scientific investigation of tool-use, it has been suggested that the body schema is dynamically shaped by the environment in order to improve our capability for action. Evidence from tool interaction research suggests that the body schema expands to incorporate

the tool within the body schema representation (see Maravita & Iriki, 2004 for a review). Specific neurophysiological and behavioural tool-use research has gradually defined and refined the characteristics of tool embodiment, such that peri-personal space extension depends on active (Farnè & Làdavas, 2000; Maravita et al., 2002), goal-directed (Farnè & Làdavas, 2000), functionally effective (Farnè, Iriki & Làdavas, 2005; Bourgeois, Farnè & Coello, 2014), limb-specific interaction with the physical tool (Jovanov et al., 2015). This body schema extension is adaptable and constantly changing. Thus, it also retracts back to the length of the limb following a period of time after tool interaction (Farnè & Làdavas, 2000). Tool-use research provides an example of acute, action-generated plasticity in the brain that, in this case, could potentially function to provide more relevant sensory information from the environment to improve motor planning and action outcomes.

1.3 Tool Embodiment in a Virtual Environment

Tool-based interactions have provided information on the neural plasticity of body representation and space coding (Iriki et al., 1996; Farnè & Làdavas, 2000; Maravita et al., 2001). The most consistent findings suggest that the change in the neural network following tool-use is compatible with the notion that tools can become incorporated into our body schema, effectively as an extension of our physical effector. Future questions regarding the mechanisms and characteristics of tool embodiment can be extended to understanding body schema modification in virtual tool and environment interactions. We are engaging more and more with virtual interfaces in all areas of our lives, from video conferencing in board meetings, to launching a golf ball into a simulated screen, to playing a video game or writing a thesis. We are able to engage with a world of possibilities at the tips of our fingers.
Something as simple as a computer mouse or keyboard is a tool-medium through which to connect with our virtual environment. Thus, the question arises: if we are using these tools in a goal-directed action that is completed in a virtual interface and environment, do we still engage the same mechanisms that induce the tool embodiment and body schema changes proposed to occur through tool-use in the real world? Despite the relevance of virtual tool embodiment for endeavours such as surgical skills training in simulators, there has been limited research investigating whether modification of the body schema follows the interaction with a tool in a

virtual environment as it does in real tool-use. Following their seminal research supporting tool embodiment in macaque monkeys (Iriki et al., 1996), Iriki, Tanaka, Obayashi and Iwamura (2001) investigated whether tool embodiment effects transferred to a virtual context. Monkeys interacted with blocks using a hand-held tool while their movements were recorded and displayed in real time on a monitor that the monkeys viewed. Following two weeks of tool acquisition, Iriki et al. (2001) employed single neuron recordings (of bimodal neurons in the IPS) and observed cellular responses to visual receptive fields associated with the monitor, such that the monkeys were able to extend their self-representation onto the virtual screen and recognize themselves in the video recording in order to effectively interact with the tool and objects. As well, the researchers observed a change in cellular responses in visual receptive fields associated with the monitor, such that following tool interaction there was an extension along the length of the tool. These new findings were comparable to those of the original study (Iriki et al., 1996) supporting tool embodiment following active use. As such, this study supports the hypothesis that tool embodiment can occur when simply viewing the hand-tool interactions on a virtual screen. The study did not, however, investigate other virtual circumstances, such as when the tool or the objects to be interacted with are purely virtual (e.g., moving a computer mouse to control a cursor that interacts with virtual icons). In these circumstances, participants are unable to physically interact with the virtual device (cursor) or objects (icons) even though the physical input device (mouse) allows them to manipulate these virtual items.
The specific research question of whether these types of virtual tools facilitate body schema modulation and tool embodiment processes comparable to real tool interactions has yet to be fully answered. However, some recent research suggests that there are defined characteristics of the virtual environment that are essential for facilitating the participant's interaction with the virtual tool (Gozli & Brown, 2011). The most common paradigm in human body schema and real/virtual tool-use studies measures the perceptual processing of, and response time to, visual stimuli on or around the hand (Kao & Goodale, 2009). In these experiments, tool embodiment was indicated by an increased efficiency in responding to targets presented near the tool (Kao & Goodale, 2009; Berti & Frassinetti, 2000). The results of these studies support the theory that, as a result of our body schema and peri-personal space coding in neural networks, there is enhanced perceptual processing of stimuli appearing on or near the hand, as well as on a

manipulated hand-held rake following use (Kao & Goodale, 2009), or even on a virtual limb (Brown, Morrissey, & Goodale, 2009). Gozli and Brown (2011) investigated the hypothesis that real tool embodiment enhances perceptual processing of targets presented in peri-personal space by using a virtual tool (i.e., computer mouse) and manipulating the spatial and temporal properties of the hand-cursor motion. Participants were exposed to three different mappings between the movements of the hand and the on-screen mouse cursor: a familiar motion condition (consistent hand and cursor spatial and temporal movement), an unfamiliar motion condition (opposite hand and cursor spatial and temporal movement) and a no-familiar-motion condition (no consistent pattern between hand and cursor motion). They tested the participants' ability to detect the onset of mouse-cursor motion and found significantly shorter responses when participants had agency and control over the mouse (familiar motion condition) than in the other two motion conditions; no response facilitation was observed in the unfamiliar or no-familiar-motion conditions. The authors concluded that, through use, the incorporation of the virtual tool into peri-personal space depends on the participants' ability to control the movements of the tool and accurately predict the spatiotemporal properties of the tool's movements in response to the user's movements. In addition, another property attributed to increased perceptual processing of a limb in a virtual environment is the recognition of one's own hand, even when it is displayed on a virtual screen far away from the body (Longo & Haggard, 2009).
Overall, there is preliminary evidence for virtual tool embodiment. This evidence leads to intriguing ideas about the limits of the dynamic expansion of our body schema, and highlights the need for further exploration to expand the basic knowledge of how virtual environments and virtual interactions affect our movement planning and execution. Because of our evolving environment and subsequent interactions in the real world, such as using virtual simulators to train physical tool-use, it is relevant to investigate the underlying neural processes that allow efficient motor control in virtual environments and whether or not these processes are comparable to real-world tool interactions. Previous studies have offered some support for virtual tool embodiment (Gozli & Brown, 2011). The purpose of the present work, however, was to further explore this concept, as well as to determine if tool embodiment in a virtual world transfers back to a real tool. This latter information is necessary to determine how we can take advantage of virtual tools to advance human motor learning, planning and action capabilities.

The Current Project: Purpose and Specific Hypotheses

The purpose of the experiment reported in the current thesis was to investigate the dynamic properties of the body schema (action-oriented body representation) that underlie tool embodiment and its consequences for action and perception. Critically, the study investigated the contextual role of the environment in which a tool is being used (real or virtual world) on the tool embodiment and schema modulation phenomenon. As such, the study employed a body-part compatibility behavioural paradigm, originally adapted from Bach et al. (2007) and used in the simple tool embodiment investigation by Jovanov et al. (2015). In the present body-part compatibility task, participants viewed a picture of a person holding a rake and responded to targets presented on the foot and hand of the model and on the end of the hand-held rake (Jovanov et al., 2015). Participants were asked to respond with a foot press to blue targets and a thumb press to red targets regardless of the location of the target, because the location of the target was irrelevant to the task. The body-part compatibility effect is said to emerge when RTs are shorter when the targets are presented on body parts in the image that are compatible with the responding limb (e.g., a red target on the hand of the model in the image) than when the target is presented on a limb that is not compatible with the responding limb (e.g., a red target on the foot). Recall that it is thought that the body-part compatibility effect emerges because the presentation of a target on a limb excites the representation of that limb in the body schema. This body-part specific activation in the body schema then makes the processing and generation of responses with that limb more efficient.
When the location of the target is on a different part of the body from the responding limb, the activation of the body part in the body schema does not facilitate (and may interfere with) activating and generating the appropriate response (see Figure 1). Thus, it was predicted that, when the responding limb and the location of the target are coded as the same body part, the presentation of the target stimulus will activate the area of the body schema that represents that body part. As a result of this activation, there will be a more efficiently activated response and shorter RTs when the responding limb and the location of the target are coded as the same limb (compatible) than when they are not coded as the same limb (incompatible). In this way, the patterns of RTs that emerge for targets presented on different limbs are thought to indicate the coding of the different limbs: RTs are relatively short when the location of the

target is coded similarly to the responding limb and are relatively long when the location of the target is coded as different from the responding limb (e.g., Jovanov et al., 2015).

Figure 1. Mean response time in milliseconds (ms) as a function of Responding Limb and Target Location. Standard error of the mean (S.E.M.) bars are shown. The data depicted here are from Jovanov et al. (2015). This pattern of RTs is a demonstration of the basic body-part compatibility effect. A similar pattern of findings is anticipated in the present work.

In the present study, the patterns of RTs in a body-part compatibility task were used as an index of the coding and embodiment of a tool. Neuro-typical human participants completed the body-part compatibility task (including targets presented on the hand, foot, and a rake being held by a model) prior to and after using a simple tool in a goal-directed manner. Participants completed the tool interaction in one of three tool conditions: using a physical tool, a virtual-immersive tool, or a virtual-keyboard tool. The three tool conditions were selected to investigate two specific variables: the environment in which the tool was used (real or virtual) and the type of action required to manipulate the tool (life-like movements allowed via a Wii tracking device, or movements isolated to simple keyboard presses). These two variables were manipulated in accordance with the bi-modal neuron hypothesis that has been

previously implicated in active tool-use, such that bimodal neurons are suggested to be altered by tool interaction (Iriki et al., 1996). Bimodal neurons respond to both visual and somatosensory input. Therefore, because bimodal neurons are suggested to be altered by tool-use, these properties (visual and proprioceptive input) were manipulated across the three tool conditions in order to investigate whether virtual visual or proprioceptive input alterations have an impact on the bimodal neuron changes (receptive field expansion) associated with tool embodiment (Iriki et al., 1996).

Predictions

Based on previous work using this body-part compatibility task (Bach et al., 2007; Welsh et al., 2014; Jovanov et al., 2015), several specific predictions were developed. First, following previous work on the body-part matching process as assessed by the body-part compatibility effect (Bach et al., 2007; Welsh et al., 2014), it was expected that thumb (hand) RTs would be shorter when the red target was presented on the hand of the model than when it was on the foot of the model (and vice versa for foot responses) (Figure 1). Such a pattern of RTs has been suggested to indicate that the participants are engaging in a self-other body matching process in which the body of the model is mapped to and activates the body schema of the observer. The critical marker of tool embodiment for the present study was any potential change in RTs for hand responses to targets presented on the rake after tool experience. Thus, the second prediction regards RTs for hand responses to targets on the hand of the model and on the end of the rake the model was holding. Because the tool was not likely to be part of the body schema prior to tool-use, a version of the body-part compatibility effect should be observed for hand RTs prior to tool-use, with hand RTs for targets on the rake being longer than those to targets on the hand.
RTs for hand responses to targets on the rake should be longer than RTs for hand responses to targets on the hand because, prior to tool-use, the rake was not coded as part of the hand and, as a result, did not activate the hand area in the body schema. Hand RTs to targets on the hand should not change following tool-use because the image of the hand was already coded in the hand region of the participants' body schema prior to the tool interaction and should not be affected by tool-use (e.g., Kao & Goodale, 2009). If tool embodiment occurred following simple tool-use (regardless of the type of tool used), then there should be a decrease in hand RTs to the target on the rake following tool-use because the hand

representation has been expanded to incorporate the rake and the rake becomes coded as part of the hand (Figure 2). Thus, hand RTs for rake targets may be comparable to hand RTs for hand targets after tool-use if the hand area in the schema has been expanded to incorporate the rake. Such a pattern of effects was previously observed in Jovanov et al. (2015). Alternatively, if the tool interaction was not sufficient to modify the body schema, then RTs for rake targets should not change after training. In this case, the version of the body-part compatibility effect would be preserved, with hand RTs for rake targets continuing to be significantly longer than hand RTs to targets on the hand (Figure 3).

Figure 2. Mean response time in ms as a function of Time and Target Location for hand responses. S.E.M. bars are shown. The data depicted here are from Jovanov et al. (2015). A similar pattern of findings is anticipated in the present work if tool embodiment can occur for a virtual tool.

Figure 3. Mean response time in ms as a function of Time and Target Location for hand responses. S.E.M. bars are shown. The data depicted here are adapted from Jovanov et al. (2015). A similar pattern of findings is anticipated in the present work if tool embodiment cannot occur for a virtual tool.

Of greatest theoretical interest were the final experimental predictions regarding the analysis of responses after the interaction with a physical versus a virtual tool. Because previous research examining physical and virtual tool-use environments is limited, it is unclear whether the virtual tools will yield results comparable to previously explored physical tool interactions and the associated limb-specific body schema modulation (Jovanov et al., 2015). Additionally, within the virtual environments, the amount of proprioceptive input and its relationship to the visual input of the virtual arm and rake was varied using two virtual conditions. In the virtual-immersive condition, participants' arm and tool movements were comparable to physical tool interactions (tracked via a Wii device). Thus, in this condition, proprioceptive and virtual visual input were comparable to those experienced in real tool interactions. In the virtual-keyboard condition, the arm of the participant was relatively still

because they controlled the movement of the virtual limb and rake via key presses. Thus, proprioceptive and virtual visual input were very different from those experienced in physical tool interaction. It was unclear whether this difference in proprioceptive input would result in different embodiment effects. Based on previous research suggesting the importance of active tool engagement in tool embodiment (Farnè & Làdavas, 2000; Maravita et al., 2002), it was predicted that participants who actively interacted with the virtual tool (as opposed to performing simple keyboard actions) would be more likely to experience tool embodiment. However, as suggested by Gozli and Brown (2011), if participants were able to experience a strong sense of agency between the simple keyboard movements and the movements on the virtual screen, it was possible that they would experience body schema modulation and tool embodiment effects following keyboard interactions as well.

Chapter 2
Methods

2.1 Participants

Thirty-eight individuals (11 men, 27 women; Physical Tool Interaction Group: 2 men, 9 women; Virtual-Immersive Interaction Group: 4 men, 9 women; Virtual-Keyboard Tool Interaction Group: 5 men, 9 women) from the University of Toronto community volunteered to participate in the study. Each participant was between the ages of 18 and 35 years, was right-hand dominant (self-report), did not have any known cognitive, brain, or movement disorders, and was naïve to the purpose of the study. Each participant provided informed consent prior to beginning data collection and was financially compensated ($10 CAD/hour) upon completion of the study. The procedures conformed to the Helsinki Declaration and were approved by the Office of Research Ethics at the University of Toronto.

2.2 Apparatus, Tasks, and Procedures

The study consisted of 3 phases. Phases 1 and 3 consisted of identical choice RT tasks in which participants responded with foot or hand responses to target stimuli presented on a screen. The RT tasks were separated by a tool interaction in Phase 2 in which participants learned to use a rake to move a ball through a course. During Phase 2, participants completed the tool training in one of three tool-type conditions: physical, virtual-immersive, or virtual-keyboard. These tool-type conditions are further explained below. Participants completed the RT tasks immediately prior to and immediately after the tool interaction. All 3 phases were completed in a single session. Following Phase 3 of data collection, participants were asked to answer a question about their previous video game experience. The question asked how often they played computer-based or game-console video games (excluding phone or tablet apps), with four response options: 0 hours per week, 1-5 hours per week, 5-10 hours per week, or >10 hours per week.

Pre-/Post-Training RT Task Completed in Phases 1 and 3

Participants sat in a chair at a desk approximately 70 cm away from a computer screen. All stimuli were presented on a 23" LCD screen. Stimuli consisted of a real-life profile picture of a young adult woman holding a rake perpendicular to her body with her outstretched right arm. The image was digitally separated from the environment in which the picture was originally taken and positioned near the centre of a white background. The digital separation was used to prevent irrelevant information in the background from distracting the participants and to make the target stimuli more visually distinct. The image was 20 cm in height and 15 cm in width. A single target was presented in each picture. The target was either a blue or a red circle (2.5 cm in diameter) that was superimposed over the hand, the foot, or the end of the rake (see Figure 4). The blue and red circles were presented equally often at the three locations. The target on the hand was presented 8.5 cm from the fixation point. The targets on the end of the rake and on the foot were each presented 13 cm from the fixation point. To prevent attentional shifts from the fixation cross in anticipation of a target at a specific location, the image was also presented equally often in a rightward and a leftward profile orientation.

Figure 4. Examples of the target displays with the target presented on: a) the foot; b) the hand; and c) the rake.

Throughout the trials, participants placed their right foot over a pedal placed on the ground and their right thumb over a button on a hand-held unit. Participants were told to press the button with the thumb or the pedal with the foot as soon as possible after recognizing that a red

or a blue circle, respectively, was presented in the picture. Target location was irrelevant to the response. Instruction screens were presented at the beginning of each testing phase in a white font on a black background. A custom program written using E-Prime (2.0) software controlled the presentation of the experimental stimuli and recorded the timing and identification of the responses. Before completing the first pre-test phase, each participant completed a familiarization session of 6 randomized images (equal numbers of red and blue targets and of rightward and leftward orientations). Each testing phase (Phases 1 and 3) consisted of 2 blocks of 48 trials of the choice response task (96 trials in each of the pre- and post-training tests). The 48 trials in each block consisted of 4 instances of the 12 trial types derived via the factorial combination of target (red, blue), target location (hand, foot, rake), and orientation (facing leftward or rightward), presented in a random order. At the beginning of each trial, the word READY was presented in black font in the middle of the white screen for 1000 ms. A black fixation cross directed and maintained attention to the middle of the screen during the foreperiod. Target images were presented at a random interval after the presentation of the fixation cross to discourage anticipation. The picture was positioned with respect to the fixation cross such that the hand, the foot, and the end of the rake were a similar distance from the centre of the cross.

Rake Interactions Completed in Phase 2

Participants completed the tool interaction immediately after completing the pre-test (Phase 1).
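Returning briefly to the RT task, its randomized factorial block structure (2 target colours x 3 target locations x 2 orientations, each repeated 4 times per 48-trial block) can be sketched as follows. The actual experiment was run with a custom E-Prime 2.0 program; this Python version only illustrates the design, and the function name is hypothetical.

```python
import itertools
import random

# Factors of the choice RT task described above.
TARGETS = ["red", "blue"]            # red -> thumb button, blue -> foot pedal
LOCATIONS = ["hand", "foot", "rake"]
ORIENTATIONS = ["leftward", "rightward"]

def build_block(repetitions=4, seed=None):
    """Return one randomized block: every factorial combination of
    target colour, target location, and orientation, repeated
    `repetitions` times and shuffled (2 x 3 x 2 x 4 = 48 trials)."""
    trials = list(itertools.product(TARGETS, LOCATIONS, ORIENTATIONS)) * repetitions
    random.Random(seed).shuffle(trials)
    return trials

block = build_block(seed=1)
# 48 trials per block; each of the 12 trial types appears exactly 4 times.
```
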
Participants were randomly assigned to one of three tool interaction groups: physical, virtual-immersive, or virtual-keyboard.

Physical Tool Interaction Group

Immediately after the pre-test (and immediately before the post-test), participants were asked to grasp, with their right hand, the same 66 cm plastic rake that was displayed in the stimulus images of the choice RT task. The participant then stood in the same orientation as the model in the image (holding the rake outstretched and perpendicular to their body) in front of an 80 (w) x 48 (d) x 80 (h) cm wooden table. Participants were asked to manipulate a 6.5 cm diameter tennis ball through a complex path outlined on the table with green tape (see Figure 5). The complex

path resembled a horizontally-oriented figure 8. Participants were asked to use the rake to move the ball through the course and to stay within the tape boundaries of the path. Excessive movements outside of these boundaries were corrected by the experimenter. Balls that dropped from the table were placed back on the table where they had fallen off. During training, participants were asked to move at a comfortable pace and to stay as accurately as possible within the boundaries. Participants repeated the movement pattern in 4 blocks of 2.5 minutes (2 blocks in each of the clockwise and counter-clockwise directions, alternately).

Figure 5. Image of an exemplar participant completing the training in the physical tool group. Participants controlled the rake to move the ball around the course.

After 10 minutes of training, participants were required to complete a proficiency test before continuing to Phase 3. Because tool embodiment is thought to emerge through practice, familiarity, and efficiency with the tool, the proficiency test was included to ensure that each participant met some basic familiarity and proficiency criteria. The test measured the number of full circuits the participant could complete in 30 seconds. The test was repeated in both the clockwise and counter-clockwise directions, and the criterion of proficiency (based on pilot testing) was set at 4 full circuits. All participants passed the proficiency test on the first attempt.

Virtual-Immersive Tool Interaction Group

Participants assigned to the virtual-immersive group completed a rake-and-ball moving interaction similar to that completed by the physical tool group, but in a virtual environment. During the interaction, the physical contact and interaction with the rake were identical to those of the physical

tool group (i.e., participants held a plastic handle attached to a plastic rod), but the table and the end of the rake were covered by a tarp such that participants could not see the outcome of their rake-end movements on the table. Instead of seeing the end of the tool and the table, participants watched a projection screen in front of them. The visual output on the screen was an interactive environment that included an arm, rake, table, ball, and complex course similar to those the physical group experienced. The directional movement of the end of the rake was tracked via a mouse attached to the end of the plastic handle and rod, and the rotational components of the rake movement were tracked via a Wii remote (using its accelerometer) secured to the shaft of the rake tool (see Figure 6). Another difference of this protocol from that of the physical tool group was that participants did not interact with a physical ball underneath the tarp. Instead, the movement of the rake in the virtual world moved a ball in the virtual world. Importantly, the physics of the rake and ball in the virtual world was programmed to match the movement of the physical rake and ball as closely as possible, such that participants felt as though their physical rake movements had a similar consequence on the movement of the virtual ball. Hence, in this condition, participants completed the rake interaction with movements comparable to those made in the physical tool condition, with the visual display being the critical difference.
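The two input streams described above (mouse displacement for the translation of the rake end; Wii-remote accelerometer data for the tilt of the shaft) can be sketched as a simple state object. This is a hypothetical simplification for illustration, not the actual tracking software used in the study.

```python
import math

class VirtualRakePose:
    """Hypothetical sketch: fuse mouse displacement (translation of the
    rake end on the virtual table) with a Wii-remote accelerometer
    reading (static tilt of the rake shaft)."""

    def __init__(self):
        self.x = 0.0     # rake-end position on the virtual table (cm)
        self.y = 0.0
        self.tilt = 0.0  # shaft tilt (radians)

    def on_mouse_move(self, dx_cm, dy_cm):
        # Mouse deltas drive the directional movement of the rake end.
        self.x += dx_cm
        self.y += dy_cm

    def on_accelerometer(self, ax, ay, az):
        # Estimate static tilt about one axis from the gravity vector.
        self.tilt = math.atan2(ax, az)

pose = VirtualRakePose()
pose.on_mouse_move(2.0, -1.5)
pose.on_accelerometer(0.0, 0.0, 1.0)  # gravity straight down -> tilt 0
```
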

Figure 6. Images of the tool interaction system for the virtual-immersive group (left panel) and of the virtual environment for the tool interaction (right panel). Movements of the tool interaction system under a tarp were accurately translated to the projected movements of the arm and rake in the virtual environment.

After 10 minutes of training, proficiency was assessed as for the physical tool interaction group. Because pilot testing revealed that the virtual interaction was more difficult than the physical interaction, the criterion of proficiency was set at 1 full circuit. All participants passed the proficiency test on the first attempt.

Virtual-Keyboard Tool Interaction Group

Participants assigned to the virtual-keyboard group viewed the same virtual output as the virtual-immersive group (virtual arm, rake, table, ball, and course), except that it was presented on a 23" LCD computer monitor. The critical difference was that participants controlled the movement of the rake using only a keyboard (see Figure 7). Participants used 6 buttons on the number pad of a keyboard to move the rake. The movement direction-to-key mapping was the following: left (4), right (6), forwards (8), backwards (2), left tilt (7), and right tilt (9). In this condition, participants did not physically interact with a tool and thus did not physically move the tool in any way comparable to the other two conditions. As such, the critical manipulation in this group, relative to the physical and virtual-immersive conditions, was the disconnect between the motor output and the proprioceptive and visual inputs the participants received.
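The stated direction-to-key mapping can be written out as a lookup table. The per-press step sizes below are hypothetical placeholders, since the text does not report how far one key press moved the virtual rake.

```python
# Number-pad key -> rake movement, as described above.
# The (dx, dy, dtilt) steps per key press are hypothetical placeholders.
KEY_MAP = {
    "4": ("left",       (-1,  0,  0)),
    "6": ("right",      (+1,  0,  0)),
    "8": ("forwards",   ( 0, +1,  0)),
    "2": ("backwards",  ( 0, -1,  0)),
    "7": ("left tilt",  ( 0,  0, -1)),
    "9": ("right tilt", ( 0,  0, +1)),
}

def apply_key(pose, key):
    """Apply one key press to a rake pose given as (x, y, tilt)."""
    _, (dx, dy, dtilt) = KEY_MAP[key]
    x, y, tilt = pose
    return (x + dx, y + dy, tilt + dtilt)

pose = (0, 0, 0)
for key in "8866":       # two presses forwards, then two presses right
    pose = apply_key(pose, key)
# pose is now (2, 2, 0)
```
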

Figure 7. Image of an exemplar participant completing the training in the virtual-keyboard group. Participants pressed the buttons on the number pad of the keyboard to move the rake and ball around the virtual course.

After 10 minutes of training, participants were required to complete a proficiency test identical to that of the virtual-immersive group (i.e., 1 full circuit in 30 s in each of the clockwise and counter-clockwise directions). All participants passed the proficiency test on the first attempt.
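Across the three groups, the proficiency test reduces to a single pass/fail rule with a condition-specific criterion (4 circuits in 30 s for the physical tool; 1 circuit for either virtual condition, per the pilot testing reported above). A sketch, with a hypothetical function name:

```python
# Circuits required in each 30 s proficiency test, per condition
# (criteria taken from the text; originally set via pilot testing).
CRITERIA = {"physical": 4, "virtual-immersive": 1, "virtual-keyboard": 1}

def passed_proficiency(condition, circuits_clockwise, circuits_counterclockwise):
    """True if the participant met the criterion in both directions."""
    criterion = CRITERIA[condition]
    return (circuits_clockwise >= criterion
            and circuits_counterclockwise >= criterion)

assert passed_proficiency("physical", 5, 4)
assert not passed_proficiency("virtual-keyboard", 1, 0)
```
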


More information

Identify these objects

Identify these objects Pattern Recognition The Amazing Flexibility of Human PR. What is PR and What Problems does it Solve? Three Heuristic Distinctions for Understanding PR. Top-down vs. Bottom-up Processing. Semantic Priming.

More information

Cognitive Modelling Themes in Neural Computation. Tom Hartley

Cognitive Modelling Themes in Neural Computation. Tom Hartley Cognitive Modelling Themes in Neural Computation Tom Hartley t.hartley@psychology.york.ac.uk Typical Model Neuron x i w ij x j =f(σw ij x j ) w jk x k McCulloch & Pitts (1943), Rosenblatt (1957) Net input:

More information

Mirror Neurons in Primates, Humans, and Implications for Neuropsychiatric Disorders

Mirror Neurons in Primates, Humans, and Implications for Neuropsychiatric Disorders Mirror Neurons in Primates, Humans, and Implications for Neuropsychiatric Disorders Fiza Singh, M.D. H.S. Assistant Clinical Professor of Psychiatry UCSD School of Medicine VA San Diego Healthcare System

More information

Note: Waxman is very sketchy on today s pathways and nonexistent on the Trigeminal.

Note: Waxman is very sketchy on today s pathways and nonexistent on the Trigeminal. Dental Neuroanatomy Thursday, February 3, 2011 Suzanne Stensaas, PhD Note: Waxman is very sketchy on today s pathways and nonexistent on the Trigeminal. Resources: Pathway Quiz for HyperBrain Ch. 5 and

More information

Lecture 35 Association Cortices and Hemispheric Asymmetries -- M. Goldberg

Lecture 35 Association Cortices and Hemispheric Asymmetries -- M. Goldberg Lecture 35 Association Cortices and Hemispheric Asymmetries -- M. Goldberg The concept that different parts of the brain did different things started with Spurzheim and Gall, whose phrenology became quite

More information

To point a finger: Attentional and motor consequences of observing pointing movements

To point a finger: Attentional and motor consequences of observing pointing movements Available online at www.sciencedirect.com Acta Psychologica 128 (2008) 56 62 www.elsevier.com/locate/actpsy To point a finger: Attentional and motor consequences of observing pointing movements Artem V.

More information

Age-differentiated analysis of the hand proximity effect in a visual search paradigm

Age-differentiated analysis of the hand proximity effect in a visual search paradigm Age-differentiated analysis of the hand proximity effect in a visual search paradigm C. Bröhl, S. Antons, J. Bützler, C.M. Schlick Chair and Institute of Industrial Engineering and Ergonomics of RWTH Aachen

More information

Frank Tong. Department of Psychology Green Hall Princeton University Princeton, NJ 08544

Frank Tong. Department of Psychology Green Hall Princeton University Princeton, NJ 08544 Frank Tong Department of Psychology Green Hall Princeton University Princeton, NJ 08544 Office: Room 3-N-2B Telephone: 609-258-2652 Fax: 609-258-1113 Email: ftong@princeton.edu Graduate School Applicants

More information

Remembering the Past to Imagine the Future: A Cognitive Neuroscience Perspective

Remembering the Past to Imagine the Future: A Cognitive Neuroscience Perspective MILITARY PSYCHOLOGY, 21:(Suppl. 1)S108 S112, 2009 Copyright Taylor & Francis Group, LLC ISSN: 0899-5605 print / 1532-7876 online DOI: 10.1080/08995600802554748 Remembering the Past to Imagine the Future:

More information

On the Time of Peripheral Sensations and Voluntary Motor Actions. Text

On the Time of Peripheral Sensations and Voluntary Motor Actions. Text On the Time of Peripheral Sensations and Voluntary Motor Actions DOUGLAS M. SNYDER ABSTRACT Libet's notions of backwards referral for peripheral sensations and unconscious cerebral initiative accompanying

More information

The Frontal Lobes. Anatomy of the Frontal Lobes. Anatomy of the Frontal Lobes 3/2/2011. Portrait: Losing Frontal-Lobe Functions. Readings: KW Ch.

The Frontal Lobes. Anatomy of the Frontal Lobes. Anatomy of the Frontal Lobes 3/2/2011. Portrait: Losing Frontal-Lobe Functions. Readings: KW Ch. The Frontal Lobes Readings: KW Ch. 16 Portrait: Losing Frontal-Lobe Functions E.L. Highly organized college professor Became disorganized, showed little emotion, and began to miss deadlines Scores on intelligence

More information

TMS Disruption of Time Encoding in Human Primary Visual Cortex Molly Bryan Beauchamp Lab

TMS Disruption of Time Encoding in Human Primary Visual Cortex Molly Bryan Beauchamp Lab TMS Disruption of Time Encoding in Human Primary Visual Cortex Molly Bryan Beauchamp Lab This report details my summer research project for the REU Theoretical and Computational Neuroscience program as

More information

Motor Functions of Cerebral Cortex

Motor Functions of Cerebral Cortex Motor Functions of Cerebral Cortex I: To list the functions of different cortical laminae II: To describe the four motor areas of the cerebral cortex. III: To discuss the functions and dysfunctions of

More information

Ch 5. Perception and Encoding

Ch 5. Perception and Encoding Ch 5. Perception and Encoding Cognitive Neuroscience: The Biology of the Mind, 2 nd Ed., M. S. Gazzaniga,, R. B. Ivry,, and G. R. Mangun,, Norton, 2002. Summarized by Y.-J. Park, M.-H. Kim, and B.-T. Zhang

More information

Supplemental Table

Supplemental Table Supplemental Table 1. T-matrix for the interaction between task (pantomiming > picture matching) and epoch (stimulus-driven functional connectivity > pre-stimulus functional connectivity) using functional

More information

A computational model of cooperative spatial behaviour for virtual humans

A computational model of cooperative spatial behaviour for virtual humans A computational model of cooperative spatial behaviour for virtual humans Nhung Nguyen and Ipke Wachsmuth Abstract This chapter introduces a model which connects representations of the space surrounding

More information

1) Drop off in the Bi 150 box outside Baxter 331 or to the head TA (jcolas).

1) Drop off in the Bi 150 box outside Baxter 331 or  to the head TA (jcolas). Bi/CNS/NB 150 Problem Set 5 Due: Tuesday, Nov. 24, at 4:30 pm Instructions: 1) Drop off in the Bi 150 box outside Baxter 331 or e-mail to the head TA (jcolas). 2) Submit with this cover page. 3) Use a

More information

Phil 490: Consciousness and the Self Handout [16] Jesse Prinz: Mental Pointing Phenomenal Knowledge Without Concepts

Phil 490: Consciousness and the Self Handout [16] Jesse Prinz: Mental Pointing Phenomenal Knowledge Without Concepts Phil 490: Consciousness and the Self Handout [16] Jesse Prinz: Mental Pointing Phenomenal Knowledge Without Concepts Main Goals of this Paper: Professor JeeLoo Liu 1. To present an account of phenomenal

More information

A Dynamic Model for Action Understanding and Goal-Directed Imitation

A Dynamic Model for Action Understanding and Goal-Directed Imitation * Manuscript-title pg, abst, fig lgnd, refs, tbls Brain Research 1083 (2006) pp.174-188 A Dynamic Model for Action Understanding and Goal-Directed Imitation Wolfram Erlhagen 1, Albert Mukovskiy 1, Estela

More information

LAB 4 BALANCE, PERCEPTUAL JUDGMENT, AND FINE MOTOR SKILLS DEVELOPMENT

LAB 4 BALANCE, PERCEPTUAL JUDGMENT, AND FINE MOTOR SKILLS DEVELOPMENT Introduction LAB 4 BALANCE, PERCEPTUAL JUDGMENT, AND FINE MOTOR SKILLS DEVELOPMENT This lab consists of a series of experiments that explore various perceptual skills that help us understand how we perform

More information

shows syntax in his language. has a large neocortex, which explains his language abilities. shows remarkable cognitive abilities. all of the above.

shows syntax in his language. has a large neocortex, which explains his language abilities. shows remarkable cognitive abilities. all of the above. Section: Chapter 14: Multiple Choice 1. Alex the parrot: pp.529-530 shows syntax in his language. has a large neocortex, which explains his language abilities. shows remarkable cognitive abilities. all

More information

FRONTAL LOBE. Central Sulcus. Ascending ramus of the Cingulate Sulcus. Cingulate Sulcus. Lateral Sulcus

FRONTAL LOBE. Central Sulcus. Ascending ramus of the Cingulate Sulcus. Cingulate Sulcus. Lateral Sulcus FRONTAL LOBE Central Ascending ramus of the Cingulate Cingulate Lateral Lateral View Medial View Motor execution and higher cognitive functions (e.g., language production, impulse inhibition, reasoning

More information

UNDERSTANDING THE CONCEPTS PERIPERSONAL SPACE, BODY SCHEMA AND BODY IMAGE. Bachelor Degree Project in Cognitive Neuroscience 15 ECTS Spring term 2012

UNDERSTANDING THE CONCEPTS PERIPERSONAL SPACE, BODY SCHEMA AND BODY IMAGE. Bachelor Degree Project in Cognitive Neuroscience 15 ECTS Spring term 2012 UNDERSTANDING THE CONCEPTS PERIPERSONAL SPACE, BODY SCHEMA AND BODY IMAGE Bachelor Degree Project in Cognitive Neuroscience 15 ECTS Spring term 2012 Magnus Hübsch Supervisor: Paavo Pylkkänen Examiner:

More information

Ch 5. Perception and Encoding

Ch 5. Perception and Encoding Ch 5. Perception and Encoding Cognitive Neuroscience: The Biology of the Mind, 2 nd Ed., M. S. Gazzaniga, R. B. Ivry, and G. R. Mangun, Norton, 2002. Summarized by Y.-J. Park, M.-H. Kim, and B.-T. Zhang

More information

Cerebral Cortex: Association Areas and Memory Tutis Vilis

Cerebral Cortex: Association Areas and Memory Tutis Vilis 97 Cerebral Cortex: Association Areas and Memory Tutis Vilis a) Name the 5 main subdivisions of the cerebral cortex. Frontal, temporal, occipital, parietal, and limbic (on the medial side) b) Locate the

More information

Arnold Trehub and Related Researchers 3D/4D Theatre in the Parietal Lobe (excerpt from Culture of Quaternions Presentation: Work in Progress)

Arnold Trehub and Related Researchers 3D/4D Theatre in the Parietal Lobe (excerpt from Culture of Quaternions Presentation: Work in Progress) Arnold Trehub and Related Researchers 3D/4D Theatre in the Parietal Lobe (excerpt from Culture of Quaternions Presentation: Work in Progress) 3D General Cognition Models 3D Virtual Retinoid Space with

More information

Thalamus and Sensory Functions of Cerebral Cortex

Thalamus and Sensory Functions of Cerebral Cortex Thalamus and Sensory Functions of Cerebral Cortex I: To describe the functional divisions of thalamus. II: To state the functions of thalamus and the thalamic syndrome. III: To define the somatic sensory

More information

Visual Context Dan O Shea Prof. Fei Fei Li, COS 598B

Visual Context Dan O Shea Prof. Fei Fei Li, COS 598B Visual Context Dan O Shea Prof. Fei Fei Li, COS 598B Cortical Analysis of Visual Context Moshe Bar, Elissa Aminoff. 2003. Neuron, Volume 38, Issue 2, Pages 347 358. Visual objects in context Moshe Bar.

More information

Retinotopy & Phase Mapping

Retinotopy & Phase Mapping Retinotopy & Phase Mapping Fani Deligianni B. A. Wandell, et al. Visual Field Maps in Human Cortex, Neuron, 56(2):366-383, 2007 Retinotopy Visual Cortex organised in visual field maps: Nearby neurons have

More information

Cortical Visual Symptoms

Cortical Visual Symptoms 대한안신경의학회지 : 제 6 권 Supplement 2 ISSN: 2234-0971 Jeong-Yoon Choi Department of Neurology, Seoul National University Bundang Hospital, Seongnam, Korea Jeong-Yoon Choi. MD. PhD. Department of Neurology, Seoul

More information

Carnegie Mellon University Annual Progress Report: 2011 Formula Grant

Carnegie Mellon University Annual Progress Report: 2011 Formula Grant Carnegie Mellon University Annual Progress Report: 2011 Formula Grant Reporting Period January 1, 2012 June 30, 2012 Formula Grant Overview The Carnegie Mellon University received $943,032 in formula funds

More information

and the surrounding annuli. Effects on grasp scaling be-

and the surrounding annuli. Effects on grasp scaling be- Brief Communication 177 The dissociation between perception and action in the Ebbinghaus illusion: Nonillusory effects of pictorial cues on grasp Angela M. Haffenden, Karen C. Schiff and Melvyn A. Goodale

More information

The Simon Effect as a Function of Temporal Overlap between Relevant and Irrelevant

The Simon Effect as a Function of Temporal Overlap between Relevant and Irrelevant University of North Florida UNF Digital Commons All Volumes (2001-2008) The Osprey Journal of Ideas and Inquiry 2008 The Simon Effect as a Function of Temporal Overlap between Relevant and Irrelevant Leslie

More information

Enhanced visual perception near the hands

Enhanced visual perception near the hands Enhanced visual perception near the hands Bachelor thesis Marina Meinert (s0163430) Supervisors: 1 st supervisor: Prof. Dr. Ing. W. B. Verwey 2 nd supervisor: Dr. M. L. Noordzij External supervisor: Dr.

More information

Sensorimotor Functioning. Sensory and Motor Systems. Functional Anatomy of Brain- Behavioral Relationships

Sensorimotor Functioning. Sensory and Motor Systems. Functional Anatomy of Brain- Behavioral Relationships Sensorimotor Functioning Sensory and Motor Systems Understanding brain-behavior relationships requires knowledge of sensory and motor systems. Sensory System = Input Neural Processing Motor System = Output

More information

Physiology of Tactile Sensation

Physiology of Tactile Sensation Physiology of Tactile Sensation Objectives: 1. Describe the general structural features of tactile sensory receptors how are first order nerve fibers specialized to receive tactile stimuli? 2. Understand

More information

Summary. Multiple Body Representations 11/6/2016. Visual Processing of Bodies. The Body is:

Summary. Multiple Body Representations 11/6/2016. Visual Processing of Bodies. The Body is: Visual Processing of Bodies Corps et cognition: l'embodiment Corrado Corradi-Dell Acqua corrado.corradi@unige.ch Theory of Pain Laboratory Summary Visual Processing of Bodies Category-specificity in ventral

More information

The Parietal Lobe as a Sensorimotor Interface: A Perspective from Clinical and Neuroimaging Data

The Parietal Lobe as a Sensorimotor Interface: A Perspective from Clinical and Neuroimaging Data NeuroImage 14, S142 S146 (2001) doi:10.1006/nimg.2001.0863, available online at http://www.idealibrary.com on The Parietal Lobe as a Sensorimotor Interface: A Perspective from Clinical and Neuroimaging

More information

The origins of localization

The origins of localization Association Cortex, Asymmetries, and Cortical Localization of Affective and Cognitive Functions Michael E. Goldberg, M.D. The origins of localization The concept that different parts of the brain did different

More information