Institute of Psychology C.N.R. - Rome


Institute of Psychology C.N.R. - Rome

Proximodistal sensory evolution in a population of artificial neural networks

Matthew Schlesinger
Domenico Parisi

Institute of Psychology, National Research Council
Viale Marx, Rome, Italy
phone: fax:
matthew@gracco.irmkant.rm.cnr.it

May 1997

Technical Report

Proximodistal sensory evolution in a population of artificial neural networks

Matthew Schlesinger
Domenico Parisi
Institute of Psychology, National Research Council
Viale Marx, Rome, Italy
phone: fax:
matthew@gracco.irmkant.rm.cnr.it

Abstract

The evolution of object-directed reaching in a population of artificial-life organisms was investigated. The capacity to discriminate between two objects emerged as organisms evolved the ability to reach. Discrimination of proximal sensory signals (i.e., by touch) evolved before discrimination of distal signals (i.e., by sight). However, organisms with the capacity to discriminate by sight were ultimately the most successful at selecting and reaching for an object which increased fitness. Organisms with both proximal and distal sensory systems evolved at an intermediate rate, due to the added computational task of integrating tactile and visual discrimination. In addition, the rate of visual discrimination evolution was modestly accelerated by providing a non-differential contact signal.

1. Introduction.

Traditional approaches to the study of object-directed reaching include research with human infants and adults, as well as animals (e.g., von Hofsten and Ronnqvist, 1993; Hartje and Ettlinger, 1971; Pelisson, Prablanc, Goodale, and Jeannerod, 1986). This work is now complemented by several recent computational models which simulate the emergence of object-directed reaching.

Each of these computational models of object-directed reaching has a different objective. First, some models focus on anatomical or physiological characteristics of the motor system (e.g., Berthier, Singh, Barto, & Houk, 1993; Kettner, Marcario, & Port, 1993; Sporns & Edelman, 1993). Other models study the dynamic or kinematic properties of reaching actions in humans (Berthier, 1995; Rosenbaum, Loukopoulos, Meulenbroek, Vaughan, & Engelbrecht, 1995). In addition, some models emphasize the goal-directed characteristics of reaching behavior, as an early form of planned action (Bullock & Grossberg, 1988; Jeannerod, 1990). Finally, artificial-life models have studied the emergence of object-directed reaching within the framework of adaptive behavior (Cecconi and Parisi, 1990; Di Sano, 1993).

The artificial-life model is particularly well-suited for studying the evolution of sensorimotor behavior (Parisi, Cecconi, and Nolfi, 1990). This model employs the agent-based approach, in which a population of autonomous agents--who each possess a genetic code, a simple nervous system, and a physical body--evolve the capacity to perform a desired sensorimotor task. This approach has been applied to the study of object-directed reaching in a population of artificial neural networks (Cecconi and Parisi, 1990; Di Sano, 1993).

2. Related work and new questions.

Prior models of object-directed reaching in artificial-life organisms have focused on the acquisition of sensorimotor coordination between visual and reaching systems. For example, Di Sano (1993) studied a population of artificial organisms that evolved the capacity to reach for and maintain contact with a nearby object. Cecconi and Parisi (1990) studied the evolution of reaching in a similar population of organisms which could locomote, as well as move their eye and arm. These models employ a relatively simple environment, with only a single type of object available for reaching. Consequently, all objects in this simplified environment are functionally identical, and have no perceptual features apart from the spatial positions they occupy. However, recent studies have begun to simulate more complex environments. These environments include different types of objects, which when acted upon have different fitness consequences. For instance, Cangelosi and Parisi (1996) studied a population of organisms that evolved the capacity to discriminate between edible and poisonous food elements.

Di Sano (1993) studied the evolution of object-directed reaching in a one-object environment. Organisms in this environment increased their fitness by touching the goal object. One can imagine a more complex two-object environment, where, in addition to an object which increases fitness, there is also a second object which lowers fitness each time it is touched. The task for an organism in this environment is twofold: (1) to evolve the capacity to coordinate eye and arm movements in the act of reaching; and (2) to discriminate between the "good" and "bad" objects.

The goal of the present simulation is to study how artificial-life organisms will evolve the capacity to discriminate between the good and bad objects in this two-object environment, while acquiring the ability to reach. A specific focus is on the evolution of tactile versus visual discrimination. Consequently, a number of different organisms, each with a different combination of sensory systems, were compared.

2.1 Prediction 1: Rate of evolution.

In a two-object environment, organisms must evolve the capacity to discriminate between objects with different fitness outcomes (i.e., good versus bad objects). In order to be differentiable, good and bad objects must appear different within at least one sensory dimension (e.g., visual, tactile, auditory, olfactory, etc.). From the organism's perspective, there is an important distinction between the distributions of sensory signals within each sensory dimension or modality. For example, proximal sensory signals are those which are only available when the organism is near to, or in contact with, an object (e.g., a tactile signal), while distal sensory signals are available across a relatively greater range of distances (e.g., a visual signal). Because proximal and distal signals are distributed differently, one may ask how these two types of signals are related during evolution. For example, does discrimination emerge simultaneously across all sensory modalities, or sooner in one modality than in another?

The present simulation focuses on touch and vision, as proximal and distal sensory systems. Four types of organisms are compared: (1) organisms which only receive a tactile sensory signal; (2) organisms which only receive a visual signal; and (3) organisms which receive both tactile and visual signals. In addition, a fourth organism is also studied, which does not receive any sensory information about the two types of objects. This organism serves as a baseline condition, in which object-directed reaching evolves without the benefit of an object-discrimination signal.

The first prediction is that discrimination of proximal sensory signals will emerge before discrimination of distal signals. Specifically, discrimination by touch will evolve more rapidly than discrimination by sight. This prediction is supported by several observations. First, proximal and distal sensory signals are associated differently with changes in an organism's fitness. A consummatory act is defined as an action which directly changes fitness (e.g., touching the good object). Preparatory acts, meanwhile, are actions which precede and make possible the consummatory act (e.g., avoiding the bad object, or approaching the good object). They have only an indirect effect on fitness. By definition, tactile (proximal) sensory signals are only available during the consummatory act, and thus are directly associated with changes in fitness. In contrast, visual (distal) sensory signals are available during both preparatory and consummatory acts, and thus have a weaker association with fitness.

Second, proximal sensory signals are associated with more direct sensory-motor mappings than distal sensory signals. Figure 1 presents the mapping from active tactile and visual sensory signals to their respective motor actions. The tactile input signal maps to only one of two possible (consummatory) actions, depending on which object is being contacted. The visual input signal, meanwhile, maps to one of four possible (consummatory or preparatory) actions, as a function of which object is seen and whether the hand is touching it. Because the tactile input signal maps to fewer possible motor actions than the visual input signal, it is expected that the tactile signal will be associated with its respective actions sooner than the visual signal.

2.2 Prediction 2: Extent of evolution.

It is predicted that touch will emerge before sight in the evolution of object discrimination. However, touch is a relatively inefficient sensory mode for object discrimination, because it requires the organism to both approach and contact an object before it can be identified as good or bad. Vision, by comparison, is a rapid and efficient mode for object discrimination: by definition, objects can be identified as soon as they are located visually, without the additional requirement of approach and contact. Therefore, the second prediction is that although discrimination by sight will evolve more slowly than discrimination by touch, vision will eventually lead to higher levels of discrimination, and consequently, higher fitness levels, for organisms who can discriminate by sight.

3. The present model.

The simulation can be described at several levels. First, an initial generation of artificial organisms is created, each with one or more sensory systems (for locating and identifying objects), a motor system (an eye and an arm), as well as a neural network for transforming sensory signals into motor movements. Second, the environment is constructed with two kinds of objects that can be seen and/or touched. Third, on the timescale of the individual, each organism is given a number of chances to reach for nearby objects. Finally, on the inter-generational timescale, the best individuals from each generation are selected to produce offspring, which comprise the subsequent generation.

Figure 1. Schematic diagram of the mapping from active tactile and visual sensory input signals to their appropriate motor outputs.

3.1 The organism.

In order to simulate the behavior of object-directed reaching, the general structure of Di Sano's (1993) model was employed. A representation of the organism and its environment is presented in Figure 2. Like Di Sano's model, the organism possesses a 60-degree visual field, and a two-segment arm. Each segment of the arm is 22 units long, and rotates freely. The environment is 100x100 square units. However, the target objects are always located within reach of the organism. Figure 3 presents a schematic display of the organism's neural network structure. The configuration is a standard, 3-layer feed-forward network, with 8 input units, 10 hidden-layer units (as employed by Di Sano), and 3 output units.

Figure 2. The organism and its environment. The organism has a 60-degree visual field, and a two-segment arm. The two objects are always placed in opposite quadrants.
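To make the spatial set-up above concrete, the following is a minimal Python sketch of the environment and organism geometry, using the dimensions reported in this section. The class and function names are illustrative, not taken from the original simulation, and the assumption that the organism sits at the center of the environment is inferred from the text.

    import math
    import random
    from dataclasses import dataclass

    @dataclass
    class World:
        size: float = 100.0           # environment is 100x100 square units
        object_size: float = 5.0      # each object occupies 5x5 units
        field_of_view: float = 60.0   # visual field, in degrees
        segment_length: float = 22.0  # each of the two arm segments

    def place_objects(world: World, rng: random.Random):
        """Place the good and bad objects in diagonally opposite quadrants,
        at random positions that remain within reach of the two-segment arm."""
        reach = 2 * world.segment_length
        half = world.size / 2          # the organism is assumed to sit at the center
        # pick a random quadrant for the good object; the bad object goes opposite
        qx, qy = rng.choice([(1, 1), (1, -1), (-1, 1), (-1, -1)])
        def sample(quad_x, quad_y):
            while True:
                x = half + quad_x * rng.uniform(0, half - world.object_size)
                y = half + quad_y * rng.uniform(0, half - world.object_size)
                if math.hypot(x - half, y - half) <= reach:
                    return (x, y)
        good = sample(qx, qy)
        bad = sample(-qx, -qy)
        return good, bad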

Figure 3. The organism's neural net structure. The input units encode the position of the hand and the object ("WHERE"), as well as the type of object ("WHAT"), during visual or tactile contact, while the output units encode movement of the eye and arm segments.

Object perception is divided into two input streams: one input vector corresponds to the position of the hand and object, respectively (i.e., WHERE signals), while a second vector corresponds to the perceptual qualities of an object which is in contact or sight (i.e., WHAT signals). In the present model, the hand is assumed to have no adaptive perceptual features, and thus is not encoded by the "WHAT" system. The division of information streams encoding object location and object properties is supported by neurophysiological data (Mishkin, Ungerleider, and Macko, 1983). In addition, connectionist models which have employed this division suggest that it provides a number of computational advantages (Jacobs, Jordan, and Barto, 1991).

As in Di Sano's model, the first four input units of the network signal the angle (relative to its position within the visual field) and the distance of the hand and target object, when each is within the visual field. The second four input units signal, respectively, the type of object that is contacted through either the visual or tactile modalities. The three output units encode the amount of change in the position of the visual field, the upperarm, and the forearm, respectively.

When the hand enters the visual field, the first input unit signals the angle of the hand, relative to the position of the organism's visual field (i.e., from 0 to 60 degrees). This value is normalized from 0 to 1, with 0 corresponding to the left edge of the visual field, and 1 to the right edge. Thus, when the hand is in the center of the visual field, the input is 0.5. The second input unit, meanwhile, encodes the Euclidean distance of the hand from the organism. The third and fourth units encode the angle and distance of the object, in a similar manner. When either the hand or object is outside the visual field, the respective units' activations are zero.

Using this method of encoding, the input state of <0, 0> corresponds to two possible conditions. Either the hand or object, respectively, is located: (1) anywhere outside the visual field, or (2) at the left edge of the visual field at a distance of 0 units from the organism. In this second condition, the hand or object is effectively located inside the organism (i.e., at the exact center of the environment). However, this position is encoded as if the hand or object were outside the visual field, making it invisible until the organism moves it to a new position.
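As a rough illustration of the WHERE encoding described above, the sketch below computes the four position inputs for the hand and the object. It is a reading of the text rather than the original code: the function names are invented, and the distance normalization constant (the arm's full reach) is an assumption, since the paper does not state how distance is scaled.

    FIELD_OF_VIEW = 60.0   # degrees
    MAX_DISTANCE = 44.0    # assumed normalization constant: the arm's full reach (2 x 22 units)

    def where_inputs(angle_in_field, distance):
        """Encode one item (hand or object) as <angle, distance>.

        angle_in_field: degrees from the left edge of the visual field, or None
                        if the item is outside the field.
        distance:       Euclidean distance of the item from the organism, in units.
        """
        if angle_in_field is None or not (0.0 <= angle_in_field <= FIELD_OF_VIEW):
            return (0.0, 0.0)                       # invisible: both units silent
        return (angle_in_field / FIELD_OF_VIEW,     # 0 = left edge, 1 = right edge, 0.5 = center
                min(distance / MAX_DISTANCE, 1.0))  # distance scaling is an assumption

    def input_vector(hand, obj, visual_what, tactile_what):
        """Full 8-unit input: hand WHERE, object WHERE, visual WHAT, tactile WHAT."""
        return list(where_inputs(*hand)) + list(where_inputs(*obj)) + list(visual_what) + list(tactile_what)

    # example: hand centered in the visual field at distance 22, good object seen, nothing touched
    x = input_vector((30.0, 22.0), (45.0, 30.0), (1.0, 0.0), (0.0, 0.0))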

Movements of the organism are determined as follows. After the spreading of activation of the input pattern, the three output values are used to update the position of the visual field and the two arm segments. In each case, the output value (which is between 0 and 1) is scaled linearly such that 0 represents a 15-degree rotation to the left, 1 a 15-degree rotation to the right, and 0.5 no movement. Intermediate values correspond to relative amounts of rotation in the appropriate direction. Both the visual field and the two segments of the arm are unconstrained in their motion, and thus can rotate in either direction through a complete 360-degree path.

3.2 The objects.

The two objects are positioned randomly in the environment, always within the organism's reach. Because the input system can only signal the position of one object at a time, the two objects are always located in diagonally opposite quadrants (though their exact position in the quadrant is random). Each object occupies a space of 5x5 units.

In a naturalistic ecology, there are several differences between visual and tactile sensory signals, including: (1) the spatiotemporal availability or distribution of each signal (e.g., proximal versus distal signals); (2) the complexity of information carried within each modality (e.g., more complexity in vision); (3) the motor patterns associated with each type of signal (e.g., visual versus tactile "scanning"); and (4) the temporal organization of the sensory input signals (e.g., parallel versus serial input streams). In order to focus on the effects of availability, touch and sight were equated with respect to all other differences. Thus, each object projects a visual signal and a tactile signal which differ only in their availability. The input vector for the good object is <1, 0> (i.e., 1 on the first WHAT unit, and 0 on the second), and <0, 1> for the bad object.

The encoding of visual and tactile sensory signals in an equivalent fashion implies an unusual ecology, in which the properties of objects are potentially perceived--both by touch and sight--in an immediate and direct manner. This feature of the present model raises the question: is there a realistic context which corresponds to this type of ecology? Indeed, there are several possible examples. For instance, objects which vary in temperature may have strongly correlated tactile and visual features (e.g., dangerously hot objects glow red, give off steam, etc.), which can be perceived in a similar manner. Alternatively, the perception of various object textures (e.g., a polished or rough surface) is also consistent with this type of tactile and visual perception. In each of these cases, while the ostensible phenomenological characteristics within each sensory modality may differ (i.e., the "qualia" of seeing versus feeling a rough object), their perception across both vision and touch may be comparably simple and direct.

The visual and tactile signals are available in accordance with the physical constraints operating within each modality. Thus, whenever an object is within the visual field, the appropriate visual WHAT input signal is available. Similarly, when the hand touches an object, the tactile WHAT input signal is available. Otherwise, when no object is seen or touched, respectively, the visual or tactile WHAT units receive <0, 0> as input.
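The following sketch ties together the output scaling from section 3.1 and the WHAT-signal availability rules above into a single movement cycle. It is a plausible reading of the description, with invented helper and field names, rather than the authors' implementation.

    from dataclasses import dataclass

    TURN_RANGE = 15.0  # maximum rotation per cycle, in degrees

    @dataclass
    class Posture:
        eye_angle: float = 0.0
        upperarm_angle: float = 0.0
        forearm_angle: float = 0.0

    def output_to_rotation(y):
        """Scale a network output in [0, 1] to a rotation in degrees:
        0 -> 15 degrees left, 0.5 -> no movement, 1 -> 15 degrees right."""
        return (y - 0.5) * 2.0 * TURN_RANGE

    def what_signal(obj_type, available):
        """Object identity as a 2-unit vector: <1, 0> = good, <0, 1> = bad,
        <0, 0> when the signal is not available (object unseen / untouched)."""
        if not available:
            return (0.0, 0.0)
        return (1.0, 0.0) if obj_type == "good" else (0.0, 1.0)

    def movement_cycle(posture, network_outputs, seen_object, touched_object):
        """One input-output cycle: rotate the eye and arm segments by the amounts
        encoded in the three output units (eye, upperarm, forearm), then return
        the WHAT inputs for the next cycle, gated by availability."""
        eye, upper, fore = network_outputs
        posture.eye_angle += output_to_rotation(eye)        # all rotations are
        posture.upperarm_angle += output_to_rotation(upper)  # unconstrained, so angles
        posture.forearm_angle += output_to_rotation(fore)    # may wrap past 360 degrees
        visual_what = what_signal(seen_object, seen_object is not None)
        tactile_what = what_signal(touched_object, touched_object is not None)
        return visual_what, tactile_what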
3.3 The genetic algorithm.

In the first generation, 100 individuals, with random connection weights (ranging between -1 and 1), are created. The arm and visual field of each organism, as well as the two objects, are randomly positioned at the start of each epoch. For each epoch, the organism is then given 150 input-output movement cycles to reach the good object and maintain contact with it. For each cycle during which the organism touches the good object, its fitness is increased by 1 unit. Similarly, fitness is reduced by 1 unit for each cycle spent touching the bad object.
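A minimal sketch of this fitness rule, assuming a simple per-cycle log of what the hand is touching (the log format and function names are illustrative):

    CYCLES_PER_EPOCH = 150
    EPOCHS_PER_LIFE = 50

    def epoch_fitness(touch_log):
        """Fitness for one epoch: +1 for every cycle spent touching the good object,
        -1 for every cycle spent touching the bad object.
        touch_log has one entry per cycle: "good", "bad", or None."""
        return sum(1 if t == "good" else -1 if t == "bad" else 0 for t in touch_log)

    def lifetime_fitness(epoch_logs):
        """Total fitness accumulated across the 50 epochs of an individual's life
        (arm, eye, and objects are re-positioned at random at the start of each epoch)."""
        return sum(epoch_fitness(log) for log in epoch_logs)

    # an epoch spent entirely on the good object contributes +150, so the
    # theoretical ceiling over a life is 50 * 150 = 7500 (see section 4.1)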

The fitness formula "rewards" organisms for reaching the good object as quickly as possible, and then maintaining contact with it. This formula was chosen in order to evolve reaching behaviors which create the possibility of interaction with the object (e.g., exploration). Why were organisms not selected solely on the basis of their time of first contact (e.g., latency)? Kinematic studies of reaching development in human infants suggest that faster reaches are associated not with acquiring the object, but rather with striking or batting it (e.g., Berthier, 1995). However, organisms normally reach for objects in order to do something with them (e.g., eat them, visually inspect them, etc.). Thus, our fitness formula was employed in order to focus on the type of reaching that normally precedes acquisition and manipulation of the goal object.

The reaching behavior of each individual is tested for 50 epochs, with fitness accumulating across each epoch. After all individuals in a generation are tested, they are sorted according to their total fitness scores. The 20 individuals with the highest fitness scores (i.e., most time spent on the good object minus time on the bad object) then generate 5 offspring each. For each offspring, a random 10% of the connection weights are mutated, by adding a random value between -1 and 1 to each weight selected for mutation. This process is repeated iteratively for 500 generations.
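The selection and mutation step described above might look roughly as follows; this is a sketch under the stated parameters (truncation selection of the top 20, 5 offspring each, 10% of the weights perturbed by a uniform value in [-1, 1]), with individuals represented simply as lists of connection weights.

    import random

    POPULATION_SIZE = 100
    N_PARENTS = 20
    OFFSPRING_PER_PARENT = 5
    MUTATION_RATE = 0.10

    def mutate(weights, rng):
        """Copy a parent's weight vector, perturbing a random 10% of the weights."""
        child = list(weights)
        n_mutations = int(MUTATION_RATE * len(child))
        for i in rng.sample(range(len(child)), n_mutations):
            child[i] += rng.uniform(-1.0, 1.0)
        return child

    def next_generation(population, fitnesses, rng):
        """Truncation selection: the 20 fittest individuals each produce 5 mutated offspring."""
        ranked = sorted(zip(fitnesses, population), key=lambda pair: pair[0], reverse=True)
        parents = [individual for _, individual in ranked[:N_PARENTS]]
        return [mutate(parent, rng) for parent in parents for _ in range(OFFSPRING_PER_PARENT)]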
3.5 Comparing the evolutionary roles of vision and touch.

Four populations were studied: Both, See, Touch, and None. In the Both population, both the tactile and visual WHAT sensory channels function normally. When an object is touched or seen, respectively, the appropriate input value is made available to the organism. In the See population, the visual channel functions normally, while the tactile input is always <0, 0>. Similarly, in the Touch population, the tactile channel functions normally while the visual input is always <0, 0>. Finally, both the tactile and visual WHAT channels are <0, 0> in the None population.

For each population of organisms, 10 random seeds (i.e., initial random conditions) were tested. For each seed, the random conditions were identical across the four populations (e.g., connection weights during the first generation, initial positions of the organism and objects in each epoch, number and location of mutations, etc.).

4. Results

The evolution of object-directed reaching in each of the populations is compared in two analyses. In the first analysis, the four populations are compared with respect to their overall level of fitness. In the second, the global behavior of reaching is decomposed into the component behaviors of (1) choosing the correct object, and (2) reaching and maintaining contact with it.

4.1 Comparison of Fitness.

Figure 4 presents the average fitness for the four populations, for 500 generations of evolution. For each generation, the average of the top 20 individuals (i.e., those selected to reproduce) is presented. The theoretical limit for fitness is 7500, but this value is not possible in practice, given the fact that the organism does not begin each epoch with its hand already positioned on the good object. During the first 25 generations, as Figure 4 indicates, the Touch, See, and Both populations remain approximately equal in fitness. However, the See population subsequently becomes the most successful, followed by the Both, Touch, and None populations. None of the four populations achieve maximum fitness.

Figure 4. Average fitness (across 10 replications) of the top 20 individuals in each generation, as a function of the population type.

By the 500th generation, the See, Both, and Touch populations accumulate an average of 69%, 59%, and 34%, respectively, of the total possible fitness.

4.2 Component-Behavior Analyses.

Object-directed reaching can be divided into several components, including the motor problem of minimizing the distance between the hand and object, and the perceptual problem of choosing the correct object to contact. First, reaching is separated into the acts of making contact with an object, and then maintaining contact. Second, discrimination is divided into overall discrimination, and, when possible, visual discrimination.

Latency. Latency can be defined as the time of the first touch to the good object. This measure, however, favors See and Both organisms, who possess a functional visual discrimination system. In order to eliminate this perceptual advantage, therefore, we defined latency as the first touch to any object. Figure 5 presents the average latencies for the top 20 individuals in each generation, across the four populations. Unlike Figure 4, which is scaled to the lifetime of an individual, Figure 5 (and all subsequent figures) is scaled to represent the average for a single epoch (i.e., 150 movement cycles). Overall, latency evolves at nearly the same rate in the Touch, See, and Both populations. While the See population has a slightly faster average latency between the 50th and 300th generations, this difference diminishes during the final 200 generations. Latency also declines in the None population, though at a slower rate than in the other populations.

Separation. Separation is defined as the number of times that the hand leaves the good object during an epoch, after reaching it. Figure 6 presents the average proportion of separations, for each 10 touches to the good object, for the top 20 individuals in each generation, as a function of population.

Figure 5. Average latency (across 10 replications) of the top 20 individuals in each generation, as a function of the population type.

(The absolute number of separations was scaled to 10 touches, in order to control for the difference between populations in the ability to successfully reach the good object.) During the first generation, all of the populations average 5 separations for each 10 touches to the good object. While the proportion of separations initially falls rapidly in the Touch and Both populations, it declines more slowly in the See population. One explanation for this difference is the fact that Touch and Both organisms possess a tactile sensory system. Thus, because the tactile system is only active during contact, it appears to provide two functions: first, to identify the perceptual properties of the object, and second, to signal contact. However, the "WHERE" system (i.e., the sensory inputs for the position of the hand and object) can also be used to determine whether the hand is in contact with an object (see section 4.3, below). Indeed, in the See population, the proportion of separations gradually declines to less than 1 separation per 10 touches to the good object. This decline can only be attributed to the emergence of contact perception in the "WHERE" system.

What happens with Touch and Both organisms, who have two alternative means for perceiving hand-object contact? Figure 6 suggests that although possessing a tactile sensory system initially facilitates a decline in the rate of separations, it may also interfere with the emergence of the "WHERE" system as an alternative system for perceiving contact. Thus, after an initial drop, the rate of separations rises in both the Touch and Both populations. However, the proportion of separations rises farther in the Touch population, and then fails to decline to the same level as in the See and Both populations. Meanwhile, the proportion of separations drops again in the Both population after the initial rise. The See and Both populations evolve nearly in parallel after generation 250.

The overall pattern suggests a number of tentative conclusions. First, a tactile sensory system may facilitate the initial emergence of the capacity to maintain contact with the good object, but this initial effect may also delay the emergence of the use of the "WHERE" system for perceiving and maintaining contact. Second, a visual "WHAT" system appears to facilitate the long-term evolution of the "WHERE" system. This conclusion is supported by two observations: (1) the proportion of separations eventually declines more quickly in the Both than in the Touch population, which only differ in the possession of the visual sensory system; and (2) the same pattern is found in the See and None populations, respectively, which also differ in the same way.

Figure 6. Average number of separations (across 10 replications) of the top 20 individuals in each generation, per every 10 touches to the good object, as a function of the population type.

Overall Discrimination. Figure 7 presents the average percent of all touches to the good object (versus the bad object) for the top 20 individuals in each population. Overall discrimination, like fitness, measures the organism's ability to reach for the good object, while avoiding the bad object. However, because fitness is a combined measure of reaching and discrimination, it is biased toward organisms who are better at reaching (e.g., those with shorter latencies). Overall discrimination, meanwhile, minimizes this bias by using the percent of all touches for each organism, rather than the absolute number of touches. Only the results for the first 100 generations are presented, in order to focus on the initial period of evolution. Because of their respective sensory inputs, overall discrimination measures the rate of visual discrimination in the See population, and tactile discrimination in the Touch population, while it is a combined measure of tactile-visual discrimination in the Both population.

During the first generations, the Touch and Both populations remain close in their overall ability to discriminate, while the See population evolves more slowly. By the 20th generation, though, the Touch and Both populations separate, as the Both population begins to converge with the See population. By generation 100, overall discrimination reaches over 95% in the See and Both populations, while it climbs just above 90% in the Touch population. After generation 100, the overall discrimination rates eventually stabilize. The See and Both populations average nearly perfect rates of discrimination, while the Touch population averages approximately 95%. Why is overall discrimination lower in the Touch population? This result is due to the necessary tendency for Touch organisms, who cannot discriminate by sight, to contact the bad object at least once, when they encounter it first during an epoch.
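Given the same per-cycle touch log used in the fitness sketch above, the behavioral measures defined so far could be computed roughly as follows. This is an illustrative sketch: the paper's "touches" may refer to distinct contact events rather than contact cycles, and the function names are invented.

    def latency(touch_log):
        """Cycle index of the first touch to any object, or None if nothing was touched.
        touch_log is a list with one entry per cycle: "good", "bad", or None."""
        for cycle, touched in enumerate(touch_log):
            if touched is not None:
                return cycle
        return None

    def separations_per_10_touches(touch_log):
        """Number of times the hand leaves the good object after reaching it,
        scaled to every 10 (cycles of) contact with the good object."""
        touches = sum(1 for t in touch_log if t == "good")
        if touches == 0:
            return 0.0
        separations = sum(1 for prev, cur in zip(touch_log, touch_log[1:])
                          if prev == "good" and cur != "good")
        return 10.0 * separations / touches

    def overall_discrimination(touch_log):
        """Fraction of all touches that land on the good object
        (the paper reports this as a percentage)."""
        good = sum(1 for t in touch_log if t == "good")
        bad = sum(1 for t in touch_log if t == "bad")
        return good / (good + bad) if (good + bad) else None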

Figure 7. Average overall discrimination (across 10 replications) of the top 20 individuals in each generation, as a function of the population type. Only the first 100 generations are presented.

Visual Discrimination. While overall discrimination includes all touches, visual discrimination is defined as the percent of times that the good object is touched first during each epoch. Like overall discrimination, visual discrimination is also positively associated with fitness. Figure 8 presents the average percent of first touches to the good object for the top 20 individuals in each generation, across the four populations.

Three interesting results deserve attention. First, unexpectedly, visual discrimination remains above chance in both the None and Touch populations. This result is surprising, given that neither population has the capacity to discriminate by sight. However, Figure 8 presents average visual discrimination in the top 20 individuals in each population. As expected, when all 100 individuals are included in the average, visual discrimination falls to 50% for both the None and Touch populations. Therefore, in the case of None and Touch organisms, Figure 8 measures the rate of visual discrimination in the 20 individuals who, by chance, tended to reach the good object first more often than the bad object.

Second, nevertheless, visual discrimination remains higher in the None population than in the Touch population (57% and 52%, respectively). Are the top 20 None organisms luckier than the top 20 Touch organisms? The answer to this question is a qualified "yes". One explanation for this difference is that while the Touch population is under selective pressure to evolve the capacity for tactile discrimination, the None population has no discrimination systems and therefore is left relatively free to vary. According to this explanation, one should predict that visual discrimination will be more variable in None than in Touch organisms. In order to test this explanation, the standard deviation of visual discrimination, for each generation of None and Touch individuals, was computed. Across all 500 generations, the standard deviation was 0.10 and 0.08 in the None and Touch populations, respectively. Statistical analysis of these population means reveals that variability in the None population is significantly higher (t(980) = 46.13, p < .00001).
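The variability comparison just described amounts to an independent-samples t-test over the two sets of per-generation standard deviations. A sketch of that analysis, assuming the per-generation values are available as two arrays (the data below are placeholders for illustration only, not the paper's data):

    import numpy as np
    from scipy import stats

    def compare_variability(sd_none, sd_touch):
        """Compare per-generation standard deviations of visual discrimination
        in the None and Touch populations (one value per generation)."""
        t, p = stats.ttest_ind(sd_none, sd_touch)
        return t, p

    # illustrative call with 500 placeholder values per population; the paper
    # reports mean standard deviations of 0.10 (None) and 0.08 (Touch)
    rng = np.random.default_rng(0)
    sd_none = rng.normal(0.10, 0.01, 500)
    sd_touch = rng.normal(0.08, 0.01, 500)
    t, p = compare_variability(sd_none, sd_touch)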

Figure 8. Average visual discrimination (across 10 replications) of the top 20 individuals in each generation, as a function of the population type.

The third, and most important, result is that there is an initial overlap in the evolution of visual discrimination in the Touch and Both populations. During the first 25 generations, visual discrimination declines in both populations. After declining, visual discrimination stabilizes slightly above chance level in the Touch population, while it slowly recovers in the Both population. This separation occurs in parallel with the break between the Touch and Both populations in the evolution of overall discrimination, observed in Figure 7. Meanwhile, visual discrimination rises quickly in the See population, stabilizing at roughly 90%. The Both population does not reach the same level of visual discrimination as the See population until after nearly 450 generations.

4.3 Follow-up Questions and Analyses

Both predictions are supported by the results. First, tactile discrimination emerges before visual discrimination (compare Touch and See populations, Figure 7). The same evolutionary pattern is also observed within the Both population (see below). Second, though slow to evolve, visual discrimination eventually becomes a more efficient means for identifying the good object. Compared to Touch and Both organisms, See organisms: (1) identify the good object faster, (2) reach for it more quickly, (3) do not lose contact with it more often than other organisms, and consequently, (4) reach the highest level of fitness.

These findings raise a number of questions. First, why are Both organisms, with two sensory discrimination systems, not the most successful at reaching and discriminating? Second, how can the evolution of visual discrimination in the See population be accelerated? A series of follow-up analyses were performed, in order to address each of these questions.

Why aren't Both organisms the best? Unlike See and Touch organisms, Both organisms have two sensory systems for discriminating between the good and bad objects. The prior analysis of separations suggests that because these two systems overlap in function, the potential redundancy may interfere with the evolution of perceptual discrimination.

Indeed, Figure 8 indicates that the emergence of visual discrimination is retarded in Both organisms, compared to See organisms. The possession of two sensory systems appears to create an additional computational task for Both organisms: in order to respond appropriately, they must evolve an intersensory coordination or integration between the tactile and visual modes of perception. There are several pieces of evidence which support this conclusion.

1. Tactile discrimination evolves first. The Both and Touch populations evolve in parallel during the first generations; the Both population subsequently diverges toward the See population. The same pattern is found not only in overall fitness (see Figure 4), but also in both measures of discrimination (see Figures 7-8). This pattern suggests that the Both population first evolves the capacity for tactile discrimination, and that visual discrimination evolves second and is subsequently coordinated with tactile discrimination.

2. Tactile discrimination maintains functionality. During contact with either object, both tactile and visual signals are simultaneously available to Both organisms. In this condition, the two signals carry the same information and are functionally equivalent. If touch is redundant with sight, is tactile discrimination eliminated or lost as vision becomes a dominant modality? If so, then eliminating the tactile signal in the final generation of the Both population should have no effect on object-directed reaching. Similarly, eliminating the visual signal should reduce fitness to 0, since the ability to discriminate is completely impaired.

In order to test this hypothesis, the top 20 individuals from the final generation of the Both population were compared with and without lesions to the appropriate input connections. In each condition, reaching was tested as during evolution: for 50 epochs, with 150 movement cycles per epoch. During the first condition, no connections were lesioned. In the second and third conditions, each individual's visual or tactile input units were lesioned (i.e., the connection weights from the appropriate input units were set to 0) prior to testing.
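A lesion of this kind simply zeroes the weights leaving the selected input units. A minimal sketch follows, assuming the first layer's weights are stored as a hidden-by-input matrix; the array layout, the unit ordering of the two WHAT pairs, and the function name are assumptions for the example.

    import numpy as np

    # indices of the 8 input units: 0-3 = WHERE (hand/object angle and distance),
    # 4-5 = visual WHAT, 6-7 = tactile WHAT (ordering of the WHAT pairs is assumed)
    VISUAL_WHAT = [4, 5]
    TACTILE_WHAT = [6, 7]

    def lesion_input_units(input_to_hidden, unit_indices):
        """Return a copy of the input-to-hidden weight matrix (hidden x input)
        with all connections from the given input units set to 0."""
        lesioned = input_to_hidden.copy()
        lesioned[:, unit_indices] = 0.0
        return lesioned

    # e.g., testing a final-generation Both individual without its tactile signal:
    # w_lesioned = lesion_input_units(w_input_to_hidden, TACTILE_WHAT)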
As expected, after the visual input units were lesioned, visual discrimination was reduced to chance level (51%; see Table 1). However, overall discrimination only dropped from 100% to 84%. Instead of declining to 0, average fitness in the top 20 Both organisms fell by 86% (i.e., from 4299 to 611). While the visually-lesioned Both organisms perform substantially better than None organisms, their performance is still below that of Touch organisms. These results suggest that after visual discrimination emerges in Both organisms, the functionality of the tactile sensory system remains limited, but not completely lost.

After the tactile units were lesioned, fitness declined to 2149 (a 50% decrease). This drop in fitness is associated with an increased number of separations between the hand and the good object (71% more separations after the tactile lesions). However, 5 additional separations do not account for over 2000 cycles of lost contact with the good object. Why does lesioning the tactile units cause fitness to drop by half? A tentative answer to this question comes from observing the behavior of Both organisms with tactile lesions. These organisms seem to have no trouble locating the good object and reaching for it. After contacting the good object, though, they appear to pull their hand off of it, as if surprised. Indeed, this is a sensory condition never experienced during evolution, and it appears to disrupt the coordination of reaching behaviors.

These anecdotal observations are supported by two additional measures in Table 1. Distance off is defined as the distance moved by the hand after each separation from the good object. Cycles off is defined as the number of cycles spent off the good object after a separation. As Table 1 indicates, when the tactile units are lesioned: (1) the hand and good object are separated more often; (2) the hand moves farther off the good object during separations; and (3) the hand takes more cycles to regain contact with the good object.

                              No Lesions    Visual Lesion    Tactile Lesion
    Fitness                   4299          611 (-86%)       2149 (-50%)
    Latency                   39            (+105%)          39 (0%)
    Separations               7             1 (-81%)         12 (+71%)
    Overall Discrimination    1.00          0.84 (-16%)      0.99 (0%)
    Visual Discrimination     0.88          0.51 (-42%)      0.88 (0%)
    Distance Off              --            (-8%)            4.8 (+64%)
    Cycles Off                --            (+839%)          12.1 (+45%)

Table 1. Average fitness and component measure performance in the top 20 individuals from the final generation of the Both population (averaged across 10 seeds), before and after lesioning the tactile or visual input signals (percent change in parentheses). Some no-lesion values were lost in the source and are marked "--".

Taken together, these results support the conclusion that the tactile units do not lose their functionality during evolution, and that visual discrimination does not replace tactile discrimination.

3. Tactile input signals are not turned off. Why is a partially redundant tactile signal not eliminated? One possible reason is that it may be difficult for organisms in the Both population to eliminate connections from the tactile input units (i.e., to reduce the strength of these connection weights to 0) after the capacity to discriminate by touch has evolved. However, what if the activity of the tactile modality were subject to mutation?

Figure 9. Average percent of the top 20 individuals from each generation, as a function of population. Perception of the tactile input signal was subject to mutation. There were an equal number of See and Both individuals in the first generation.

In other words, if the entire tactile input system could be turned on or off by mutation, would the Both population evolve the tendency to turn off the tactile channel as visual discrimination improved? In order to test this second hypothesis, five new replications of the Both population were simulated, with one additional feature: each organism was given an additional gene, subject to mutation, which determines whether the tactile modality is functional or not. Thus, when this gene is on, the organism functions like an individual from the Both population, and when off, like an individual from the See population. Three conditions were simulated. In the first, the initial generation consisted of only Both organisms; in the second, 50 were Both organisms and 50 were See organisms; and in the third, all of the organisms in the first generation were from the See population. In all cases, the capacity to perceive the tactile input signal was subject to mutation; that is, the gene encoding the activity of the tactile system had a 10% probability of switching from off to on, or vice versa. Via mutation, a parent could generate an offspring from either population.

The pattern of results from all three conditions was consistent. Figure 9 presents the average percent of the top 20 individuals during each generation, as a function of population, in the second condition (i.e., an even mix of Both and See organisms in the first generation). The results are also consistent with the previous findings. First, during the initial 25 generations, nearly all of the top 20 individuals were from the Both population. However, the percent of parents from the Both population subsequently declined, falling close to 50% by the 500th generation. This pattern supports the general finding that tactile discrimination is preferred over visual discrimination during the initial generations. Second, although the percent of parents from the See population steadily increases, it never becomes the dominant population. This result suggests that when given the chance to deactivate the tactile sensory channel, a substantial number of Both organisms instead tend to maintain both the visual and tactile sensory modalities in a coordinated fashion.

Thus, tactile discrimination emerges before visual discrimination in the Both population, maintains its functionality throughout evolution, and is not eliminated in Both organisms when subject to mutation. These supplemental findings, taken together with the earlier results, support the conclusion that the evolution of visual discrimination in Both organisms is retarded by the additional computational task of evolving a sensory coordination between tactile and visual discrimination.

Can the evolution of visual discrimination be accelerated? A second question concerns the rate of visual discrimination evolution in the See population. As Figure 1 indicates, use of the visual signal for determining the appropriate behavior (e.g., approach, leave, etc.) requires not only seeing an object, but also perceiving whether or not the hand and object are in contact. Note that these two pieces of information are combined in the tactile signal. As discussed earlier, an active tactile signal identifies not only contact with an object, but also the perceptual properties of the object. How does a See organism know when it is touching an object, if it does not possess a tactile input signal like Touch and Both organisms?
The answer to this question is found by considering the organisms studied by Di Sano (1993). Organisms in this simulation were presented only with the position of their hand and the object (like the None population in the present study). Contact can be computed from these input values by comparing the relative angles and distances of the hand and object: when the hand contacts the object, the respective input values are equal or near-equal. These same four input signals were available to all organisms in the present study. For None and See organisms, this derived signal is the only basis for perceiving contact. For Touch and Both organisms, meanwhile, the tactile input signal functions as a second means for perceiving contact.
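A sketch of this derived contact signal, computed from the four WHERE inputs as described above; the tolerance value is an assumption, since the paper says only that the inputs are "equal or near-equal" at contact:

    CONTACT_TOLERANCE = 0.05  # assumed threshold on the normalized inputs

    def where_contact(hand_angle, hand_distance, obj_angle, obj_distance):
        """Infer hand-object contact from the WHERE inputs alone: contact is signalled
        when the hand's and object's angle and distance inputs are (near-)equal,
        and both items are actually visible (a <0, 0> pair means 'outside the field')."""
        if (hand_angle, hand_distance) == (0.0, 0.0) or (obj_angle, obj_distance) == (0.0, 0.0):
            return False
        return (abs(hand_angle - obj_angle) < CONTACT_TOLERANCE and
                abs(hand_distance - obj_distance) < CONTACT_TOLERANCE)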

These observations suggest an additional reason why visual discrimination in See organisms may evolve more slowly than tactile discrimination in Touch organisms: visual discrimination requires the capacity to perceive hand-object contact, which See organisms do not initially possess but instead must evolve. If this explanation is correct, then providing See organisms with a direct input that signals contact (circumventing the need for the derived signal) should accelerate the evolution of visual discrimination. However, the prior analysis of separations in each population suggests that providing See organisms with a contact signal may also slow the evolution of the "WHERE" system for perceiving contact. This observation leads to the counterintuitive prediction that while a contact signal may facilitate visual discrimination in See organisms, it should also increase the number of separations in this population, compared to the original See organisms.

In order to test these hypotheses, the See+ population was constructed: See+ organisms have the same 6 input units as their See cousins (i.e., 4 WHERE units and 2 WHAT units), in addition to a seventh unit which receives an input of <1> when either the good or bad object is touched, and <0> otherwise.

In order to compare the rates of visual discrimination in the See and See+ populations, Figure 10 presents the average percent of first touches to the good object for the top 20 individuals in each generation (as before, averaged across 10 replications). As expected, visual discrimination evolves more rapidly in See+ organisms. While the evolutionary trajectories are generally close, the See+ population averages nearly 5% more first touches to the good object between generations 50 and 300. After generation 300, visual discrimination in the two populations is equivalent. Figure 11 presents the proportion of separations, per 10 touches to the good object, for the top 20 individuals in the See and See+ populations. As predicted, the proportion of separations is higher in the See+ population for roughly the first 100 generations. However, as Figure 11 indicates, the two populations then converge and evolve in parallel through generation 500.

Figure 10. Average visual discrimination (across 10 replications) of the top 20 individuals in each generation, as a function of the population type.
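For concreteness, the See+ input layout described above could be written as a small extension of the earlier input-vector sketch; the helper name and the position of the contact unit at the end of the vector are assumptions for the example.

    def see_plus_inputs(hand_where, obj_where, visual_what, touching):
        """4 WHERE units + 2 visual WHAT units + 1 contact unit
        (the contact unit is 1 whenever either object is touched, good or bad)."""
        return list(hand_where) + list(obj_where) + list(visual_what) + [1.0 if touching else 0.0]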

Figure 11. Average number of separations (across 10 replications) of the top 20 individuals in each generation, per every 10 touches to the good object, as a function of the population type.

No other major differences emerge in the evolution of the See and See+ populations. Why is fitness not higher in See+ than See organisms, given their accelerated evolution of visual discrimination? The answer is that a 5% advantage in visual discrimination means that, across the 50 epochs of an individual's life, See+ organisms contact the good object first only 2 or 3 more times than See organisms. In other words, a substantial acceleration in the evolution of visual discrimination is associated with only a small change in fitness.

5. Conclusions and Discussion

As predicted, proximal sensory discrimination emerges before distal sensory discrimination in the evolution of object-directed reaching. However, distal sensory discrimination eventually becomes the superior mode for guiding reaching behaviors in populations of artificial neural-network organisms. These general results can be elaborated in several ways.

First, why is visual discrimination associated with the highest levels of fitness? As Figure 5 indicates, latency does not appear to vary as a function of the sensory discrimination system: Touch, See, and Both organisms make first contact with an object at comparable rates. Figure 6, however, suggests that the ability to maintain contact does vary as a function of the sensory discrimination system: the visual sensory system facilitates, while the tactile sensory system ultimately interferes with, evolving the capacity to maintain contact (see Redundant Functions, below).

Second, although visual discrimination evolves more slowly than tactile discrimination, it is also a more efficient means for identifying the good object. When See and Both organisms see the bad object, they move both their arm and visual field away almost immediately. Touch organisms, meanwhile, approach the bad object (which, in generation 500, takes an average of 30 cycles) before identifying it; they subsequently move away, in search of the good object. This difference in search time accounts for most of the gap in fitness between Touch and See organisms.

5.1 Evolution and Redundant Functions

A tactile sensory system has both advantages and disadvantages for the evolution of object-directed reaching. The evolution of discrimination between good and bad objects, and of maintaining contact with the good object, are both initially accelerated in organisms with a tactile sensory system. However, the tactile sensory system also delays the emergence of other systems which can perform these functions: the "WHERE" system, in the case of maintaining contact, and the visual sensory system, in the case of discrimination.

As discussed in section 3.2, visual and tactile discrimination systems differ along several dimensions in the natural environment. The present findings suggest an additional important difference between visual and tactile discrimination. While the tactile sensory system may facilitate the early evolution of object-directed reaching, it also interferes with other systems that initially evolve more slowly, but which would otherwise go on to perform the same function more successfully. However, as the supplementary analyses indicate, once tactile discrimination has emerged, the tactile sensory system tends to retain its functionality even after alternative systems begin to function. The tactile sensory system, though redundant with other systems, is not eliminated.

In contrast, organisms with only a visual "WHERE" system and a visual "WHAT" system possess the minimally necessary sensory structures for evolving object-directed reaching and discrimination (i.e., no computational redundancies). Consequently, these organisms evolve more slowly at first, but eventually reach the most adaptive level of functioning. This pattern of findings suggests that the possession of potentially redundant sensory systems creates the benefit of an early acceleration in evolution, which may be outweighed by the later costs.

5.2 Evolution versus Development of Reaching

The present model focuses on changes in discrimination and reaching at the population level. Consequently, a genetic algorithm was employed in order to train a population of artificial organisms to reach. Several researchers have also considered a similar variability-and-selection procedure as a developmental process, at the individual level (e.g., Edelman, 1987; Rosenbaum, Loukopoulos, Meulenbroek, Vaughan, & Engelbrecht, 1995; Siegler, 1994). Developmental theories of reaching which are consistent with this view have proposed that infants actively explore alternative action patterns (Berthier, 1995; Sporns & Edelman, 1993; von Hofsten and Ronnqvist, 1993; Thelen, Corbetta, Kamm, Spencer, Schneider, and Zernicke, 1993).

The analogy between evolutionary and developmental processes raises the question of recapitulation: are there aspects of reaching development which mirror the pattern observed in reaching evolution? Direct comparisons between the development of reaching in human infants and the results of the present model are limited, due to a number of differences between the two domains (e.g., biomechanical constraints, nervous system complexity, etc.). However, an important feature shared by both domains is the emergence of multi-modal perception. During the evolution of object-directed reaching in artificial-life organisms, discrimination emerges first by touch, and then by sight. In organisms with both tactile and visual sensory systems, visual discrimination appears to ultimately play a more important role during reaching. However, there is a strong tendency to maintain the functionality of both of these systems. Similarly, during early infancy, haptic (i.e., oral) object exploration emerges before visual object exploration. However, oral exploration subsequently declines, and by age 12 months visual perception becomes the dominant modality for object recognition and discrimination in humans (Adolph, Eppler, and Gibson, 1993).


EDGE DETECTION. Edge Detectors. ICS 280: Visual Perception EDGE DETECTION Edge Detectors Slide 2 Convolution & Feature Detection Slide 3 Finds the slope First derivative Direction dependent Need many edge detectors for all orientation Second order derivatives

More information

Introduction to Computational Neuroscience

Introduction to Computational Neuroscience Introduction to Computational Neuroscience Lecture 11: Attention & Decision making Lesson Title 1 Introduction 2 Structure and Function of the NS 3 Windows to the Brain 4 Data analysis 5 Data analysis

More information

Learning Utility for Behavior Acquisition and Intention Inference of Other Agent

Learning Utility for Behavior Acquisition and Intention Inference of Other Agent Learning Utility for Behavior Acquisition and Intention Inference of Other Agent Yasutake Takahashi, Teruyasu Kawamata, and Minoru Asada* Dept. of Adaptive Machine Systems, Graduate School of Engineering,

More information

Lecture 6. Perceptual and Motor Schemas

Lecture 6. Perceptual and Motor Schemas CS564 - Brain Theory and Artificial Intelligence Lecture 6. Perceptual and Motor Reading Assignments: TMB2:* Sections 2.1, 2.2, 5.1 and 5.2. HBTNN: Schema Theory (Arbib) [Also required] Distributed Artificial

More information

Changing expectations about speed alters perceived motion direction

Changing expectations about speed alters perceived motion direction Current Biology, in press Supplemental Information: Changing expectations about speed alters perceived motion direction Grigorios Sotiropoulos, Aaron R. Seitz, and Peggy Seriès Supplemental Data Detailed

More information

The Neuroscience of Vision III

The Neuroscience of Vision III The Neuroscience of Vision III Putting the Pieces Together Drawing upon the apparent differences in processing in the temporal lobe (object identification) and parietal lobe (spatial processing), Ungerleider

More information

Categories Formation in Self-Organizing Embodied Agents

Categories Formation in Self-Organizing Embodied Agents Categories Formation in Self-Organizing Embodied Agents Stefano Nolfi Institute of Cognitive Sciences and Technologies National Research Council (CNR) Viale Marx, 15, 00137, Rome, Italy s.nolfi@istc.cnr.it

More information

12/18/2013. Observational Methods. Field Research. Observational Studies. Observational Studies

12/18/2013. Observational Methods. Field Research. Observational Studies. Observational Studies al Methods Field Research Research involving the direct observation of behavior. Three decisions to make: Will the observation occur in a natural or contrived setting? Will the participants know they are

More information

Sensory Cue Integration

Sensory Cue Integration Sensory Cue Integration Summary by Byoung-Hee Kim Computer Science and Engineering (CSE) http://bi.snu.ac.kr/ Presentation Guideline Quiz on the gist of the chapter (5 min) Presenters: prepare one main

More information

Learning to classify integral-dimension stimuli

Learning to classify integral-dimension stimuli Psychonomic Bulletin & Review 1996, 3 (2), 222 226 Learning to classify integral-dimension stimuli ROBERT M. NOSOFSKY Indiana University, Bloomington, Indiana and THOMAS J. PALMERI Vanderbilt University,

More information

Katsunari Shibata and Tomohiko Kawano

Katsunari Shibata and Tomohiko Kawano Learning of Action Generation from Raw Camera Images in a Real-World-Like Environment by Simple Coupling of Reinforcement Learning and a Neural Network Katsunari Shibata and Tomohiko Kawano Oita University,

More information

Auditory Scene Analysis

Auditory Scene Analysis 1 Auditory Scene Analysis Albert S. Bregman Department of Psychology McGill University 1205 Docteur Penfield Avenue Montreal, QC Canada H3A 1B1 E-mail: bregman@hebb.psych.mcgill.ca To appear in N.J. Smelzer

More information

Lesson 6 Learning II Anders Lyhne Christensen, D6.05, INTRODUCTION TO AUTONOMOUS MOBILE ROBOTS

Lesson 6 Learning II Anders Lyhne Christensen, D6.05, INTRODUCTION TO AUTONOMOUS MOBILE ROBOTS Lesson 6 Learning II Anders Lyhne Christensen, D6.05, anders.christensen@iscte.pt INTRODUCTION TO AUTONOMOUS MOBILE ROBOTS First: Quick Background in Neural Nets Some of earliest work in neural networks

More information

SUPPLEMENTAL MATERIAL

SUPPLEMENTAL MATERIAL 1 SUPPLEMENTAL MATERIAL Response time and signal detection time distributions SM Fig. 1. Correct response time (thick solid green curve) and error response time densities (dashed red curve), averaged across

More information

An Artificial Neural Network Architecture Based on Context Transformations in Cortical Minicolumns

An Artificial Neural Network Architecture Based on Context Transformations in Cortical Minicolumns An Artificial Neural Network Architecture Based on Context Transformations in Cortical Minicolumns 1. Introduction Vasily Morzhakov, Alexey Redozubov morzhakovva@gmail.com, galdrd@gmail.com Abstract Cortical

More information

Cognitive Neuroscience History of Neural Networks in Artificial Intelligence The concept of neural network in artificial intelligence

Cognitive Neuroscience History of Neural Networks in Artificial Intelligence The concept of neural network in artificial intelligence Cognitive Neuroscience History of Neural Networks in Artificial Intelligence The concept of neural network in artificial intelligence To understand the network paradigm also requires examining the history

More information

MEMORY MODELS. CHAPTER 5: Memory models Practice questions - text book pages TOPIC 23

MEMORY MODELS. CHAPTER 5: Memory models Practice questions - text book pages TOPIC 23 TOPIC 23 CHAPTER 65 CHAPTER 5: Memory models Practice questions - text book pages 93-94 1) Identify the three main receptor systems used by a performer in sport. Where is the filtering mechanism found

More information

Object Substitution Masking: When does Mask Preview work?

Object Substitution Masking: When does Mask Preview work? Object Substitution Masking: When does Mask Preview work? Stephen W. H. Lim (psylwhs@nus.edu.sg) Department of Psychology, National University of Singapore, Block AS6, 11 Law Link, Singapore 117570 Chua

More information

A Computational Model For Action Prediction Development

A Computational Model For Action Prediction Development A Computational Model For Action Prediction Development Serkan Bugur 1, Yukie Nagai 3, Erhan Oztop 2, and Emre Ugur 1 1 Bogazici University, Istanbul, Turkey. 2 Ozyegin University, Istanbul, Turkey. 3

More information

Sensation vs. Perception

Sensation vs. Perception PERCEPTION Sensation vs. Perception What s the difference? Sensation what the senses do Perception process of recognizing, organizing and dinterpreting ti information. What is Sensation? The process whereby

More information

Lateral Geniculate Nucleus (LGN)

Lateral Geniculate Nucleus (LGN) Lateral Geniculate Nucleus (LGN) What happens beyond the retina? What happens in Lateral Geniculate Nucleus (LGN)- 90% flow Visual cortex Information Flow Superior colliculus 10% flow Slide 2 Information

More information

LEAH KRUBITZER RESEARCH GROUP LAB PUBLICATIONS WHAT WE DO LINKS CONTACTS

LEAH KRUBITZER RESEARCH GROUP LAB PUBLICATIONS WHAT WE DO LINKS CONTACTS LEAH KRUBITZER RESEARCH GROUP LAB PUBLICATIONS WHAT WE DO LINKS CONTACTS WHAT WE DO Present studies and future directions Our laboratory is currently involved in two major areas of research. The first

More information

Cognitive Modelling Themes in Neural Computation. Tom Hartley

Cognitive Modelling Themes in Neural Computation. Tom Hartley Cognitive Modelling Themes in Neural Computation Tom Hartley t.hartley@psychology.york.ac.uk Typical Model Neuron x i w ij x j =f(σw ij x j ) w jk x k McCulloch & Pitts (1943), Rosenblatt (1957) Net input:

More information

Introduction to Biological Anthropology: Notes 12 Mating: Primate females and males Copyright Bruce Owen 2009 We want to understand the reasons

Introduction to Biological Anthropology: Notes 12 Mating: Primate females and males Copyright Bruce Owen 2009 We want to understand the reasons Introduction to Biological Anthropology: Notes 12 Mating: Primate females and males Copyright Bruce Owen 2009 We want to understand the reasons behind the lifestyles of our non-human primate relatives

More information

Target-to-distractor similarity can help visual search performance

Target-to-distractor similarity can help visual search performance Target-to-distractor similarity can help visual search performance Vencislav Popov (vencislav.popov@gmail.com) Lynne Reder (reder@cmu.edu) Department of Psychology, Carnegie Mellon University, Pittsburgh,

More information

Competition Between Objective and Novelty Search on a Deceptive Task

Competition Between Objective and Novelty Search on a Deceptive Task Competition Between Objective and Novelty Search on a Deceptive Task Billy Evers and Michael Rubayo Abstract It has been proposed, and is now widely accepted within use of genetic algorithms that a directly

More information

Biologically-Inspired Human Motion Detection

Biologically-Inspired Human Motion Detection Biologically-Inspired Human Motion Detection Vijay Laxmi, J. N. Carter and R. I. Damper Image, Speech and Intelligent Systems (ISIS) Research Group Department of Electronics and Computer Science University

More information

Grace Iarocci Ph.D., R. Psych., Associate Professor of Psychology Simon Fraser University

Grace Iarocci Ph.D., R. Psych., Associate Professor of Psychology Simon Fraser University Grace Iarocci Ph.D., R. Psych., Associate Professor of Psychology Simon Fraser University Theoretical perspective on ASD Where does attention and perception fit within the diagnostic triad of impairments?

More information

The evolution of cooperative turn-taking in animal conflict

The evolution of cooperative turn-taking in animal conflict RESEARCH ARTICLE Open Access The evolution of cooperative turn-taking in animal conflict Mathias Franz 1*, Daniel van der Post 1,2,3, Oliver Schülke 1 and Julia Ostner 1 Abstract Background: A fundamental

More information

Lecture 2.1 What is Perception?

Lecture 2.1 What is Perception? Lecture 2.1 What is Perception? A Central Ideas in Perception: Perception is more than the sum of sensory inputs. It involves active bottom-up and topdown processing. Perception is not a veridical representation

More information

(Visual) Attention. October 3, PSY Visual Attention 1

(Visual) Attention. October 3, PSY Visual Attention 1 (Visual) Attention Perception and awareness of a visual object seems to involve attending to the object. Do we have to attend to an object to perceive it? Some tasks seem to proceed with little or no attention

More information

Dynamics of Color Category Formation and Boundaries

Dynamics of Color Category Formation and Boundaries Dynamics of Color Category Formation and Boundaries Stephanie Huette* Department of Psychology, University of Memphis, Memphis, TN Definition Dynamics of color boundaries is broadly the area that characterizes

More information

Congruency Effects with Dynamic Auditory Stimuli: Design Implications

Congruency Effects with Dynamic Auditory Stimuli: Design Implications Congruency Effects with Dynamic Auditory Stimuli: Design Implications Bruce N. Walker and Addie Ehrenstein Psychology Department Rice University 6100 Main Street Houston, TX 77005-1892 USA +1 (713) 527-8101

More information

Models of Parent-Offspring Conflict Ethology and Behavioral Ecology

Models of Parent-Offspring Conflict Ethology and Behavioral Ecology Models of Parent-Offspring Conflict Ethology and Behavioral Ecology A. In this section we will look the nearly universal conflict that will eventually arise in any species where there is some form of parental

More information

Categorical Perception

Categorical Perception Categorical Perception Discrimination for some speech contrasts is poor within phonetic categories and good between categories. Unusual, not found for most perceptual contrasts. Influenced by task, expectations,

More information

Incorporation of Imaging-Based Functional Assessment Procedures into the DICOM Standard Draft version 0.1 7/27/2011

Incorporation of Imaging-Based Functional Assessment Procedures into the DICOM Standard Draft version 0.1 7/27/2011 Incorporation of Imaging-Based Functional Assessment Procedures into the DICOM Standard Draft version 0.1 7/27/2011 I. Purpose Drawing from the profile development of the QIBA-fMRI Technical Committee,

More information

A Race Model of Perceptual Forced Choice Reaction Time

A Race Model of Perceptual Forced Choice Reaction Time A Race Model of Perceptual Forced Choice Reaction Time David E. Huber (dhuber@psych.colorado.edu) Department of Psychology, 1147 Biology/Psychology Building College Park, MD 2742 USA Denis Cousineau (Denis.Cousineau@UMontreal.CA)

More information

The Role of Feedback in Categorisation

The Role of Feedback in Categorisation The Role of in Categorisation Mark Suret (m.suret@psychol.cam.ac.uk) Department of Experimental Psychology; Downing Street Cambridge, CB2 3EB UK I.P.L. McLaren (iplm2@cus.cam.ac.uk) Department of Experimental

More information

Development of Prototype Abstraction and Exemplar Memorization

Development of Prototype Abstraction and Exemplar Memorization Baetu, I., & Shultz, T. R. (2010). Development of prototype abstraction and exemplar memorization. In S. Ohlsson & R. Catrambone (Eds.), Proceedings of the 32nd Annual Conference of the Cognitive Science

More information

DEVELOPMENT OF THE MOTOR SYSTEM

DEVELOPMENT OF THE MOTOR SYSTEM DEVELOPMENT OF THE MOTOR SYSTEM HDP1: Fall 2007 Joan Stiles Department of Cognitive Science University of California, San Diego Motor system development begins during the Prenatal period Thalamocortical

More information

A Race Model of Perceptual Forced Choice Reaction Time

A Race Model of Perceptual Forced Choice Reaction Time A Race Model of Perceptual Forced Choice Reaction Time David E. Huber (dhuber@psyc.umd.edu) Department of Psychology, 1147 Biology/Psychology Building College Park, MD 2742 USA Denis Cousineau (Denis.Cousineau@UMontreal.CA)

More information

CHAPTER 6: Memory model Practice questions at - text book pages 112 to 113

CHAPTER 6: Memory model Practice questions at - text book pages 112 to 113 QUESTIONS AND ANSWERS QUESTIONS AND ANSWERS CHAPTER 6: Memory model Practice questions at - text book pages 112 to 113 1) Which of the following sequences reflects the order in which the human brain processes

More information

UMEÅ PSYCHOLOGICAL REPORTS

UMEÅ PSYCHOLOGICAL REPORTS F "3 UMEÅ PSYCHOLOGICAL REPORTS No. 95 1975 Department of Psychology University of Utoeå S 901 87 Umeå/Sweden A NOTE ON IHK5FMA1Ï0N PROCESSING IN CROSS-MODAL MATCHING Jörgen Garvill Bo blander A NOTE ON

More information

http://www.diva-portal.org This is the published version of a paper presented at Future Active Safety Technology - Towards zero traffic accidents, FastZero2017, September 18-22, 2017, Nara, Japan. Citation

More information

Users. Perception and Cognition

Users. Perception and Cognition Users Perception and Cognition This lecture relies on Designing with the Mind in Mind by Jeff Johnson, The Humane Interface by Jef Raskin, and other sources. 1 Ergonomics and Human Factors Designing devices

More information

SHORT AND LONG MEMORIES IN OCTOPUS AND THE INFLUENCE OF THE VERTICAL LOBE SYSTEM

SHORT AND LONG MEMORIES IN OCTOPUS AND THE INFLUENCE OF THE VERTICAL LOBE SYSTEM J. Exp. Biol. (1970), 53. 385-393 385 With 4 text-figures fprinted in Great Britain SHORT AND LONG MEMORIES IN OCTOPUS AND THE INFLUENCE OF THE VERTICAL LOBE SYSTEM BY J. Z. YOUNG Department of Anatomy,

More information

Introduction to the Special Issue on Multimodality of Early Sensory Processing: Early Visual Maps Flexibly Encode Multimodal Space

Introduction to the Special Issue on Multimodality of Early Sensory Processing: Early Visual Maps Flexibly Encode Multimodal Space BRILL Multisensory Research 28 (2015) 249 252 brill.com/msr Introduction to the Special Issue on Multimodality of Early Sensory Processing: Early Visual Maps Flexibly Encode Multimodal Space Roberto Arrighi1,

More information

Stimulus any aspect of or change in the environment to which an organism responds. Sensation what occurs when a stimulus activates a receptor

Stimulus any aspect of or change in the environment to which an organism responds. Sensation what occurs when a stimulus activates a receptor Chapter 8 Sensation and Perception Sec 1: Sensation Stimulus any aspect of or change in the environment to which an organism responds Sensation what occurs when a stimulus activates a receptor Perception

More information

Coordination in Sensory Integration

Coordination in Sensory Integration 15 Coordination in Sensory Integration Jochen Triesch, Constantin Rothkopf, and Thomas Weisswange Abstract Effective perception requires the integration of many noisy and ambiguous sensory signals across

More information

Decline of the McCollough effect by orientation-specific post-adaptation exposure to achromatic gratings

Decline of the McCollough effect by orientation-specific post-adaptation exposure to achromatic gratings *Manuscript Click here to view linked References Decline of the McCollough effect by orientation-specific post-adaptation exposure to achromatic gratings J. Bulthé, H. Op de Beeck Laboratory of Biological

More information

Classification and Statistical Analysis of Auditory FMRI Data Using Linear Discriminative Analysis and Quadratic Discriminative Analysis

Classification and Statistical Analysis of Auditory FMRI Data Using Linear Discriminative Analysis and Quadratic Discriminative Analysis International Journal of Innovative Research in Computer Science & Technology (IJIRCST) ISSN: 2347-5552, Volume-2, Issue-6, November-2014 Classification and Statistical Analysis of Auditory FMRI Data Using

More information

Increasing Spatial Competition Enhances Visual Prediction Learning

Increasing Spatial Competition Enhances Visual Prediction Learning (2011). In A. Cangelosi, J. Triesch, I. Fasel, K. Rohlfing, F. Nori, P.-Y. Oudeyer, M. Schlesinger, Y. and Nagai (Eds.), Proceedings of the First Joint IEEE Conference on Development and Learning and on

More information

Framework for Comparative Research on Relational Information Displays

Framework for Comparative Research on Relational Information Displays Framework for Comparative Research on Relational Information Displays Sung Park and Richard Catrambone 2 School of Psychology & Graphics, Visualization, and Usability Center (GVU) Georgia Institute of

More information

Navigation: Inside the Hippocampus

Navigation: Inside the Hippocampus 9.912 Computational Visual Cognition Navigation: Inside the Hippocampus Jakob Voigts 3 Nov 2008 Functions of the Hippocampus: Memory and Space Short term memory Orientation Memory consolidation(h.m)

More information

Fundamentals of Psychophysics

Fundamentals of Psychophysics Fundamentals of Psychophysics John Greenwood Department of Experimental Psychology!! NEUR3045! Contact: john.greenwood@ucl.ac.uk 1 Visual neuroscience physiology stimulus How do we see the world? neuroimaging

More information

Neural Correlates of Human Cognitive Function:

Neural Correlates of Human Cognitive Function: Neural Correlates of Human Cognitive Function: A Comparison of Electrophysiological and Other Neuroimaging Approaches Leun J. Otten Institute of Cognitive Neuroscience & Department of Psychology University

More information

Toward Minimally Social Behavior: Social Psychology Meets Evolutionary Robotics

Toward Minimally Social Behavior: Social Psychology Meets Evolutionary Robotics Toward Minimally Social Behavior: Social Psychology Meets Evolutionary Robotics Tom Froese and Ezequiel A. Di Paolo CCNR, University of Sussex, Brighton, UK t.froese@gmail.com, ezequiel@sussex.ac.uk Abstract.

More information

Convergence Principles: Information in the Answer

Convergence Principles: Information in the Answer Convergence Principles: Information in the Answer Sets of Some Multiple-Choice Intelligence Tests A. P. White and J. E. Zammarelli University of Durham It is hypothesized that some common multiplechoice

More information

A Memory Model for Decision Processes in Pigeons

A Memory Model for Decision Processes in Pigeons From M. L. Commons, R.J. Herrnstein, & A.R. Wagner (Eds.). 1983. Quantitative Analyses of Behavior: Discrimination Processes. Cambridge, MA: Ballinger (Vol. IV, Chapter 1, pages 3-19). A Memory Model for

More information

What is mid level vision? Mid Level Vision. What is mid level vision? Lightness perception as revealed by lightness illusions

What is mid level vision? Mid Level Vision. What is mid level vision? Lightness perception as revealed by lightness illusions What is mid level vision? Mid Level Vision March 18, 2004 Josh McDermott Perception involves inferring the structure of the world from measurements of energy generated by the world (in vision, this is

More information

PSYC 441 Cognitive Psychology II

PSYC 441 Cognitive Psychology II PSYC 441 Cognitive Psychology II Session 4 Background of Object Recognition Lecturer: Dr. Benjamin Amponsah, Dept., of Psychology, UG, Legon Contact Information: bamponsah@ug.edu.gh College of Education

More information

Evolution of Plastic Sensory-motor Coupling and Dynamic Categorization

Evolution of Plastic Sensory-motor Coupling and Dynamic Categorization Evolution of Plastic Sensory-motor Coupling and Dynamic Categorization Gentaro Morimoto and Takashi Ikegami Graduate School of Arts and Sciences The University of Tokyo 3-8-1 Komaba, Tokyo 153-8902, Japan

More information

CSE Introduction to High-Perfomance Deep Learning ImageNet & VGG. Jihyung Kil

CSE Introduction to High-Perfomance Deep Learning ImageNet & VGG. Jihyung Kil CSE 5194.01 - Introduction to High-Perfomance Deep Learning ImageNet & VGG Jihyung Kil ImageNet Classification with Deep Convolutional Neural Networks Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton,

More information

Will now consider in detail the effects of relaxing the assumption of infinite-population size.

Will now consider in detail the effects of relaxing the assumption of infinite-population size. FINITE POPULATION SIZE: GENETIC DRIFT READING: Nielsen & Slatkin pp. 21-27 Will now consider in detail the effects of relaxing the assumption of infinite-population size. Start with an extreme case: a

More information

The significance of sensory motor functions as indicators of brain dysfunction in children

The significance of sensory motor functions as indicators of brain dysfunction in children Archives of Clinical Neuropsychology 18 (2003) 11 18 The significance of sensory motor functions as indicators of brain dysfunction in children Abstract Ralph M. Reitan, Deborah Wolfson Reitan Neuropsychology

More information

VISUAL PERCEPTION OF STRUCTURED SYMBOLS

VISUAL PERCEPTION OF STRUCTURED SYMBOLS BRUC W. HAMILL VISUAL PRCPTION OF STRUCTURD SYMBOLS A set of psychological experiments was conducted to explore the effects of stimulus structure on visual search processes. Results of the experiments,

More information

Sensation and Perception

Sensation and Perception 1 Sensation and Perception DR. ARNEL BANAGA SALGADO, Doctor of Psychology (USA) FPM (Ph.D.) Psychology (India) Doctor of Education (Phl) Master of Arts in Nursing (Phl) Master of Arts in Teaching Psychology

More information

Multi-joint limbs permit a flexible response to unpredictable events

Multi-joint limbs permit a flexible response to unpredictable events Exp Brain Res (1997) 117:148±152 Springer-Verlag 1997 RESEARCH NOTE E.M. Robertson R.C. Miall Multi-joint limbs permit a flexible response to unpredictable events Received: 24 March 1997 / Accepted: 7

More information

Sparse Coding in Sparse Winner Networks

Sparse Coding in Sparse Winner Networks Sparse Coding in Sparse Winner Networks Janusz A. Starzyk 1, Yinyin Liu 1, David Vogel 2 1 School of Electrical Engineering & Computer Science Ohio University, Athens, OH 45701 {starzyk, yliu}@bobcat.ent.ohiou.edu

More information

PSYC20007 READINGS AND NOTES

PSYC20007 READINGS AND NOTES Week 4 Lecture 4 Attention in Space and Time The Psychological Function of Spatial Attention To assign limited-capacity processing resources to relevant stimuli in environment - Must locate stimuli among

More information

Phil 490: Consciousness and the Self Handout [16] Jesse Prinz: Mental Pointing Phenomenal Knowledge Without Concepts

Phil 490: Consciousness and the Self Handout [16] Jesse Prinz: Mental Pointing Phenomenal Knowledge Without Concepts Phil 490: Consciousness and the Self Handout [16] Jesse Prinz: Mental Pointing Phenomenal Knowledge Without Concepts Main Goals of this Paper: Professor JeeLoo Liu 1. To present an account of phenomenal

More information

Chapter 5. Optimal Foraging 2.

Chapter 5. Optimal Foraging 2. University of New Mexico Biology 310L Principles of Ecology Lab Manual Page -31 Chapter 5. Optimal Foraging 2. Today's activities: 1. Discuss Dussault et al. 2005 2. Work through the marginal value exercise

More information

Concepts About the Causes of Development: Travel, Visual Experience, and the Development of Dynamic Spatial Orientation

Concepts About the Causes of Development: Travel, Visual Experience, and the Development of Dynamic Spatial Orientation INFANCY, 1(2), 231 238 Copyright 2000, Lawrence Erlbaum Associates, Inc. Concepts About the Causes of Development: Travel, Visual Experience, and the Development of Dynamic Spatial Orientation John J.

More information

Rules of apparent motion: The shortest-path constraint: objects will take the shortest path between flashed positions.

Rules of apparent motion: The shortest-path constraint: objects will take the shortest path between flashed positions. Rules of apparent motion: The shortest-path constraint: objects will take the shortest path between flashed positions. The box interrupts the apparent motion. The box interrupts the apparent motion.

More information

Animal Behavior. Relevant Biological Disciplines. Inspirations => Models

Animal Behavior. Relevant Biological Disciplines. Inspirations => Models Animal Behavior Relevant Biological Disciplines Neuroscience: the study of the nervous system s anatomy, physiology, biochemistry and molecular biology Psychology: the study of mind and behavior Ethology:

More information

Normative Representation of Objects: Evidence for an Ecological Bias in Object Perception and Memory

Normative Representation of Objects: Evidence for an Ecological Bias in Object Perception and Memory Normative Representation of Objects: Evidence for an Ecological Bias in Object Perception and Memory Talia Konkle (tkonkle@mit.edu) Aude Oliva (oliva@mit.edu) Department of Brain and Cognitive Sciences

More information

M.Sc. in Cognitive Systems. Model Curriculum

M.Sc. in Cognitive Systems. Model Curriculum M.Sc. in Cognitive Systems Model Curriculum April 2014 Version 1.0 School of Informatics University of Skövde Sweden Contents 1 CORE COURSES...1 2 ELECTIVE COURSES...1 3 OUTLINE COURSE SYLLABI...2 Page

More information