EFFECTS OF D-AMPHETAMINE ON SIGNALED AND UNSIGNALED DELAYS TO REINFORCEMENT. Lee Davis Thomas


EFFECTS OF D-AMPHETAMINE ON SIGNALED AND UNSIGNALED DELAYS TO REINFORCEMENT

Lee Davis Thomas

A Thesis Submitted to the University of North Carolina Wilmington in Partial Fulfillment of the Requirements for the Degree of Master of Arts

Department of Psychology

2009

Approved by Advisory Committee:
Christine Hughes
Carol Pilgrim
Raymond Pitts, Jr., Chair

Approved by:
Dean, Graduate School

TABLE OF CONTENTS

ABSTRACT
ACKNOWLEDGEMENTS
DEDICATION
LIST OF TABLES
LIST OF FIGURES
INTRODUCTION
    Choice and Preference
    Self-Control Choices
    Delay Discounting
    Psychomotor Stimulants and Self-Control
    The Present Study
METHOD
    Subjects
    Apparatus
    Procedure
    Data Analysis
RESULTS
    Performance During Control Conditions
    Performance for Saline, Control, and Drug Conditions
DISCUSSION
    Performance During Control Conditions
    Effects of d-Amphetamine

REFERENCES

ABSTRACT

Four pigeons responded under a progressive-delay procedure. In a signaled-delay condition, a chained variable-interval (VI) 30-s progressive-time (PT) 4-s schedule was arranged; in an unsignaled-delay condition, a tandem VI 30-s PT 4-s schedule was arranged. Two pigeons experienced a signaled-unsignaled-signaled delay sequence, whereas the other two experienced an unsignaled-signaled-unsignaled delay sequence. Effects of saline and d-amphetamine were determined under each condition. At intermediate doses (1.0 and 1.78 mg/kg), delay functions were shallower, AUC was increased, and, when possible, break points were increased compared to saline; these effects were not systematically related to signaling conditions. These effects on control by delay often were accompanied by decreased response rates at 0 s. These results suggest that stimulus conditions associated with the delay may not play a crucial role in effects of d-amphetamine and other stimulants on behavior controlled by reinforcement delay.

ACKNOWLEDGEMENTS

I give my thanks to Carol Rothstein, who was my Latin teacher in high school; she made me the student that I am today by never accepting anything but the best. She will be sorely missed in the teaching profession. I also would like to thank my mentors, Dr. Raymond Pitts, Jr. and Dr. Christine Hughes, whose continual pursuit of scholastic excellence has been invaluable. They have tremendously aided my understanding of topics in the field of Behavior Analysis. Without their support this process would have been virtually impossible. I would like to thank all of the undergraduates who spent many hours in the lab helping with the research presented in this thesis. A special thanks goes to my parents and my sister. They continued to push me along even though, at times, the end appeared to never be in sight. Finally, I would like to thank my thesis committee. Thank you all for the many hours spent revising, and for the guidance during this process.

DEDICATION

I would like to dedicate this thesis to my mother, Rose Thomas, whose continual support through this long and arduous journey has been fathomless. She has always inspired me, and will continue to inspire me, to conquer all obstacles that I have faced and will face.

LIST OF TABLES

1. Break points for all pigeons, signaling conditions, and doses (1.0 and 1.78 mg/kg d-amphetamine)

LIST OF FIGURES

1. Mean response rates plotted as a function of the nominal delay
2. Mean responses per minute in the VI as a function of the nominal delay (s) after administration of saline (filled symbols) and 1.0 mg/kg d-amphetamine (unfilled symbols) during the signaled and unsignaled conditions for each pigeon
3. Mean responses per minute in the VI as a function of the nominal delay (s) after administration of saline (filled symbols) and 1.78 mg/kg d-amphetamine (unfilled symbols) during the signaled and unsignaled conditions for each pigeon
4. Mean area under the curve (AUC, left column) and mean response rate in the first interval of the session when the delay was 0 s (right column) calculated from sessions before which 1.0 mg/kg (white bars) and 1.78 mg/kg (striped bars) were administered in the signaled and unsignaled conditions for each pigeon
5. Group mean area under the curve (AUC, upper panel) and mean response rate in the first interval of the session when the delay was 0 s (lower panel) calculated from sessions before which 1.0 mg/kg (white bars) and 1.78 mg/kg (striped bars) were administered in the signaled and unsignaled conditions
6. Log (drug rate/control rate) plotted as a function of the log control rate before which 1.0 mg/kg d-amphetamine was administered in the signaled and unsignaled conditions for each pigeon
7. Log (drug rate/control rate) plotted as a function of the log control rate before which 1.78 mg/kg d-amphetamine was administered in the signaled and unsignaled conditions for each pigeon

INTRODUCTION

Choice and Preference

Choice is ubiquitous; most behavior occurs in a context in which multiple alternatives are available. For example, students enrolled at any university frequently may have the option of studying for a test or going out with friends. Someone trying to quit smoking can smoke a cigarette or refrain from smoking. An individual may be faced with the choice of buying a fancy sports car or buying a fuel-efficient car. These examples illustrate that, conceptualized this way, much of our daily behavior involves choice. According to a behavior-analytic view, choices are determined primarily by the consequences associated with the alternatives. That is, choice involves operant behavior. Given that choice is an important component of behavior, it is understandable that it has been a major focus of experimental study over the last several decades (see Davison & McCarthy, 1988; de Villiers, 1977; Mazur, 2006).

The concurrent (conc) schedule of reinforcement is a popular procedure for studying choice in the laboratory. In a concurrent schedule, two or more operant responses are available simultaneously, and each alternative is associated with its own contingency of reinforcement. The responses often are independent in that responding on one alternative does not affect the consequences associated with the other alternatives. In a seminal study, Herrnstein (1961) exposed pigeons to concurrent variable-interval (VI) schedules in which the relative reinforcement rates were varied across alternatives. Herrnstein reported an extremely important effect: the proportion of responses allocated to a given alternative was approximately equal to the proportion of reinforcers obtained via that alternative. This effect has been replicated extensively, and it holds across a variety of species, response topographies, and reinforcer types (see Davison & McCarthy, 1988). This effect can be described by the mathematical formulation now known as the matching law, written as follows:

B_L / (B_L + B_R) = R_L / (R_L + R_R).    (1)

In this equation, B refers to responses and R refers to reinforcers, and the L and R subscripts indicate the left- and right-hand alternatives, respectively. This formulation is relatively straightforward. For example, it states that, if 80% of the total reinforcers were obtained via the left alternative and 20% of the total reinforcers were obtained via the right alternative, then approximately 80% and 20% of the responses would be allocated to the left and right alternatives, respectively. The matching equation provides a quantitative description of behavioral allocation under multi-operant conditions.
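To illustrate Equation 1, the short sketch below computes the response proportion that strict matching predicts from obtained reinforcer counts; the counts are hypothetical and are chosen only for illustration.

# A minimal sketch of the matching law (Equation 1): the predicted proportion of
# responses allocated to the left alternative equals the proportion of reinforcers
# obtained via that alternative. The reinforcer counts below are hypothetical.

def predicted_left_proportion(reinforcers_left: float, reinforcers_right: float) -> float:
    """B_L / (B_L + B_R) predicted by strict matching, i.e., R_L / (R_L + R_R)."""
    return reinforcers_left / (reinforcers_left + reinforcers_right)

# 80% of reinforcers obtained on the left -> ~80% of responses predicted on the left.
print(predicted_left_proportion(40, 10))  # 0.8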

Self-Control Choices

Sometimes individuals are faced with choices that involve conflicting consequences. Fried foods taste extremely good, but eating them could have long-term adverse consequences. Spending money now may lead to the acquisition of goods; however, not saving money could have unfavorable effects down the road. Taking certain drugs may result in reinforcing effects, but repeated use of drugs can lead to harmful future consequences. It is important to note that, in these examples, there likely are several factors that affect which option will ultimately be chosen. Nevertheless, they appear to have one key similarity: these situations involve a choice between an alternative that delivers something immediately and an alternative that provides something better in the long run. That is, perhaps these sorts of choices can be described as "good now" or "better later." Rachlin (1974) and Ainslie (1974) suggested that one key variable affecting such choices is the temporal aspect associated with each alternative. That is, the fact that the better option is delayed decreases the likelihood of choosing that option.

Choices of the "good now" option are described as being impulsive. They are labeled as impulsive because they typically deliver something immediately, but at a cost in the long term. Choices of the alternative that delivers something better later could be characterized as self-control choices. These choices deliver consequences at a later point in time; the long-term effects of selecting these types of alternatives, however, could be extremely positive. Rachlin and Green (1972) studied effects of the availability of commitment responses on self-control choices. They provided pigeons with the opportunity to emit a response that would commit them to the larger, more delayed reinforcer (i.e., this response eliminated the option providing the smaller, more immediate reinforcer). The study included a two-choice procedure in which selection of one alternative delivered the larger reinforcer after a fixed delay; however, responses on this key also eliminated the opportunity to select the smaller, more immediate reinforcer. In addition, selection of the alternative that provided the larger reinforcer could be made only at the beginning of each trial. Selection of the other key produced a delay after which a choice situation was presented; the choice situation included the availability of a larger, more delayed reinforcer and a smaller, more immediate reinforcer. They found that subjects consistently emitted the commitment response if the opportunity was provided well in advance of the choice point between the larger and smaller reinforcers. That is, if the overall delay to reinforcement for both alternatives was relatively short, then subjects were more likely to choose the smaller, more immediate reinforcer. However, if the overall delay to reinforcement was relatively long for both alternatives, then subjects were more likely to choose the larger, more delayed reinforcer.

Delay Discounting

As mentioned above, it appears that an important factor producing impulsive choices is the longer delay associated with the larger reinforcer. Delay discounting describes how the effectiveness of a reinforcer is controlled by the delay between the choice and the delivery of the reinforcer. A great deal of attention has been devoted to elucidating the form of the function relating the effectiveness of a reinforcer (its "value") to its delay. Mazur (1987, 1988), and others (e.g., Richards, Mitchell, de Wit, & Seiden, 1997), have presented substantial evidence suggesting that a hyperbolic function provides the best description of this relation. Mazur's (1987) hyperbolic equation is as follows:

V = A / (1 + kD),    (2)

where V refers to the value, or effectiveness, of the reinforcer, A refers to the amount of the reinforcer, D refers to the delay associated with the reinforcer, and k is a fitted parameter that characterizes the rate at which the function decreases to asymptote; k typically is referred to as the discounting parameter. The higher the k value, the faster the function reaches asymptote (i.e., the higher the value of k, the more the delay discounts reinforcer value).
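To illustrate Equation 2, the short sketch below computes discounted value for a few delays under two k values; the amounts, delays, and k values are hypothetical and serve only to show that a larger k produces steeper discounting.

# Hyperbolic delay discounting (Equation 2): V = A / (1 + k*D).
# Amounts, delays, and k values below are hypothetical.

def discounted_value(amount: float, delay_s: float, k: float) -> float:
    """Value (effectiveness) of a reinforcer of a given amount delivered after delay_s seconds."""
    return amount / (1.0 + k * delay_s)

for k in (0.05, 0.5):  # larger k -> value falls off more steeply with delay
    values = [round(discounted_value(5.0, d, k), 2) for d in (0, 2, 10, 40)]
    print(f"k = {k}: {values}")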

Delay discounting can be affected by several factors, such as past experience with delays to reinforcement (Mazur & Logue, 1978) and administration of psychoactive substances (Charrier & Thiebot, 1996; de Wit, Enggasser, & Richards, 2002; Evenden & Ryan, 1996; Logue, 1992; Pietras, Cherek, Lane, Tcheremissine, & Steinberg, 2003; Pitts & Febbo, 2004; Pitts & McKinney, 2005; Richards et al., 1997; Richards, Sabol, & de Wit, 1999; Wade, de Wit, & Richards, 2000). Mazur and Logue (1978) demonstrated that reinforcement history can alter effects of reinforcement delay. The researchers used two groups of subjects (an experimental and a control group); both groups chose between a smaller and a larger reinforcer amount. The experimental group initially had a 6-s delay for both the smaller and the larger reinforcers, but the delay for the smaller reinforcer was faded (i.e., gradually reduced) to 0 s over the course of several sessions. The control group had a 0-s delay for the smaller reinforcer and a 5.5-s delay for the larger reinforcer (i.e., the delay for the smaller reinforcer was not faded). Following the fading procedure, subjects in the experimental condition chose the larger, more delayed reinforcer at substantially higher rates than did subjects in the control condition when the delay to the smaller reinforcer was 0 s. This finding illustrates that delay discounting and, hence, preference for the larger, more delayed reinforcer can be altered by reinforcement history.

Recently, investigators have been interested in studying effects of drugs on self-control choices (Charrier & Thiebot, 1996; de Wit et al., 2002; Evenden & Ryan, 1996; Logue, 1992; Pietras et al., 2003; Pitts & Febbo, 2004; Pitts & McKinney, 2005; Richards et al., 1997; Richards et al., 1999; Wade et al., 2000). In particular, psychomotor stimulants have received considerable attention, likely due in part to the clinical prescription of these drugs to treat individuals with attention-deficit disorder and attention-deficit/hyperactivity disorder (Greenhill, 2001; Murray & Kollins, 2000). A diagnostic characteristic of these disorders is impulsive behavior patterns. Therefore, it is exceedingly important that behavioral patterns modified by drugs, and the behavioral mechanisms that affect these patterns, be studied in the laboratory.

Psychomotor Stimulants and Self-Control

Early research on effects of psychomotor stimulants on self-control choices indicated that these drugs increased preference for the smaller, more immediate reinforcer (i.e., increased impulsive choices). For example, Evenden and Ryan (1996) used a discrete-trials choice procedure in which the delay to the larger reinforcer escalated across blocks of trials.

The subjects were presented with 8 or 12 trials per block; each block had a different delay value associated with the larger reinforcer, but the delay to the smaller reinforcer was fixed at 0 s for the entire session. The delay to the larger reinforcer increased from 0 s to 60 s across blocks. Choosing the larger alternative yielded five pellets of food and choosing the smaller alternative yielded one pellet of food. Within each session, choice of the larger alternative decreased as a function of the increasing delays; as the delay to reinforcement increased, subjects were more likely to choose the smaller, more immediate alternative. During control conditions, when the delays were relatively short, subjects preferred the larger, more delayed reinforcer; as the delay increased across the session, however, preference switched toward the smaller, more immediate reinforcer. Moderate doses of d-amphetamine increased choice of the smaller, more immediate reinforcer; that is, the function relating reinforcer effectiveness to delay was shifted to the left. Similarly, Charrier and Thiebot (1996) also reported that d-amphetamine increased choices of a smaller, more immediate reinforcer.

In contrast to the data reported by Evenden and Ryan (1996) and Charrier and Thiebot (1996), later research (Pietras et al., 2003; Pitts & Febbo, 2004; Pitts & McKinney, 2005; Richards et al., 1997; Richards et al., 1999; Wade et al., 2000) demonstrated that psychomotor stimulants increased choices of a larger, more delayed reinforcer. For example, Richards et al. (1999) used an adjusting-amount procedure to examine effects of methamphetamine on self-control choices. Their adjusting-amount procedure consisted of two alternatives. One alternative (the standard) provided a larger reinforcer presented after a fixed delay. The other (adjusting) alternative provided an immediate reinforcer, the size of which adjusted as a function of the subjects' choices.

Each choice of the standard alternative increased the amount of reinforcement for the adjusting alternative by 10% on the next trial, whereas each choice of the adjusting alternative decreased its amount by 10% on the next trial. Responding in this procedure yielded an indifference point for each subject, defined as the amount of reinforcement on the adjusting alternative when the subject chose each option approximately 50% of the time. It should be noted that indifference points are the result of the interaction between reinforcement magnitude and delay; the indifference point can be conceptualized as the point at which the reinforcer values of the alternatives are equal. In this procedure, lower indifference points are said to reflect impulsive behavior and higher indifference points are said to reflect self-control. That is, repeated selection of the smaller, more immediate reinforcer drives the indifference point down, and repeated selection of the larger, more delayed alternative drives the indifference point up. Richards et al. reported that moderate doses of methamphetamine increased the indifference points compared to baseline. That is, at these doses, subjects selected the larger, more delayed alternative to a much greater degree, which increased the indifference point. These data suggest that the administration of methamphetamine increased the effectiveness of the larger, more delayed reinforcer (see also Wade et al., 2000).

Pitts and McKinney (2005) used a procedure similar to Evenden and Ryan's (1996) within-session, escalating-delay procedure to test effects of methylphenidate on self-control choice. Pitts and McKinney's procedure included five blocks of trials in which the delay to the larger reinforcer escalated across blocks. Each block consisted of two initial forced-choice trials and five free-choice trials. Subjects selected between two alternatives, with the delay to the smaller reinforcer held constant at 0 s and the delay to the larger reinforcer increasing from 0 s to 50 s. Selection of the larger, more delayed reinforcer initiated a signaled delay; the delay was signaled by the blinking of the light above the lever associated with the larger, more delayed alternative.

During control conditions, subjects selected the larger-reinforcer alternative almost exclusively when the delays were equal, and as the delays increased subjects were more likely to select the smaller, more immediate reinforcer. Pitts and McKinney reported that, in most cases, intermediate doses of methylphenidate increased selection of the larger, more delayed alternative compared to baseline responding. That is, they found that the function relating reinforcer effectiveness to delay was shifted to the right.

Pitts and Febbo (2004) exposed 4 pigeons to a concurrent-chains procedure with escalating delays across blocks. During the initial links, two alternatives (red or green keys) were available simultaneously, and a single random-interval (RI) 60-s schedule controlled contact with the terminal links. The houselight and side keys (either red or green) were illuminated during the initial link. Once the initial-link schedule was satisfied, the houselight and side keys were turned off, and the active key color was re-illuminated on the houselight. One terminal link provided 1.5-s access to reinforcement (grain) delayed by a fixed time (FT) of 2 s. The other terminal link provided 4.5-s access to reinforcement delayed by an increasing FT that ranged from 2 to 40 s. There were five 10-min blocks. For each subject, intermediate doses of methamphetamine reduced sensitivity to the delay. That is, at these doses, subjects chose the larger, more delayed reinforcer even when delays were relatively large. A similar effect has been replicated with humans (de Wit et al., 2002; Pietras et al., 2003).

Why are there discrepancies between the effects of psychomotor stimulants on self-control choices across the earlier and later studies? It is unclear at this point why earlier studies indicated that psychomotor stimulants decreased self-control choices (Charrier & Thiebot, 1996; Evenden & Ryan, 1996; Logue, 1992), but later studies indicated that psychomotor stimulants increased self-control choices (Pietras et al., 2003; Pitts & Febbo, 2004; Pitts & McKinney, 2005; Richards et al., 1999; Wade et al., 2000).

Several procedural differences across these studies have been noted (see Pitts & McKinney, 2005; Richards et al., 1999). One conspicuous difference has to do with the signaling conditions associated with the delay. Most of the early studies did not include an explicit and unique stimulus associated with the delay to the larger reinforcer; that is, the same stimulus conditions were present during both the delay and the inter-trial interval. In contrast, virtually all of the more recent studies, in which stimulants increased self-control choices, provided explicit and unique stimulus conditions during the delay to the larger reinforcer. This suggests that the differential results between earlier and later studies could be due to the presence or absence of a unique stimulus associated with the delay to the larger reinforcer.

Noting the differential signaling conditions across the earlier and later studies, Cardinal, Robbins, and Everitt (2000) examined effects of signaled delays to reinforcement. Their experimental procedure was similar to the one used by Evenden and Ryan (1996); the smaller reinforcer (1 food pellet) was presented immediately, and the delay to the larger reinforcer (5 food pellets) increased within each session across blocks of trials. Each choice retracted the levers and extinguished the houselight; however, there were two different signaling conditions associated with the delay to the larger reinforcer. In the cue condition, the stimulus light above the lever associated with the larger reinforcer was illuminated for the entire duration of the delay. In the no-cue condition, the stimulus light above the lever was not illuminated during the delay to the larger reinforcer. That is, in the no-cue condition, stimulus conditions during the delay (blackout) were the same as those during the inter-trial interval, whereas in the cue condition a delay-correlated stimulus was presented following a response. They reported that moderate doses of d-amphetamine increased choices of the larger, more delayed reinforcer in the cue condition; that is, the delay functions in the cue condition shifted slightly to the right.

The opposite effect was found for d-amphetamine in the no-cue condition; the drug increased choices of the smaller, more immediate reinforcer. That is, the delay functions in the no-cue condition shifted to the left, which is indicative of greater delay discounting. These findings suggested that effects of d-amphetamine were modulated by the particular signaling conditions associated with the delay to the larger reinforcer.

One interpretation of the results provided by Cardinal et al. (2000) is that psychomotor stimulants increased the efficacy of the conditioned reinforcer associated with the delay. In theory, conditioned, or secondary, reinforcers acquire the capacity to strengthen behavior by being paired with primary reinforcers (Hill, 1970; Robbins, 1975). Cardinal et al. postulated that the signal present during the reinforcement delay served as a conditioned reinforcer, and that d-amphetamine increased choices of the larger, more delayed reinforcer in their cue condition by enhancing the conditioned-reinforcing effectiveness of the signal. This interpretation is supported by previous studies investigating effects of stimulant drugs on conditioned reinforcement (Branch & Walker, 1996; Files, Branch, & Clody, 1989; Hill, 1970; Robbins, 1975). For example, Files et al. (1989) assessed effects of methylphenidate on responding when a brief stimulus complex was presented along with a primary reinforcer and during standard operant extinction (i.e., when the brief stimulus complex was presented without a primary reinforcer). They used a second-order random-ratio (RR) 2 (VI 30-s) schedule, in which completion of each VI 30-s component resulted in presentation of a brief stimulus and completion of an average of two VI 30-s components resulted in food presentation. This schedule was in effect during the first portion of each session, and food extinction was in effect during the second portion.

Two extinction conditions (during which no food was presented) were used during the second portion of each session; these two conditions were presented irregularly across experimental sessions. One condition consisted of standard food extinction, in which responses had no scheduled consequences. In the other food-extinction condition, the brief stimulus complex that previously had been paired with food delivery was presented according to a VI 30-s schedule. After stable performance was established in each condition, effects of saline and several doses of methylphenidate were determined. Methylphenidate increased response rates during the second portion of the session for both extinction conditions, but rates were increased to a greater extent when the brief stimulus was presented than during standard extinction. These data suggest that methylphenidate increased the conditioned-reinforcing effectiveness of the paired brief stimuli.

Although an account of Cardinal et al.'s (2000) data in terms of conditioned reinforcement is a viable one, it should be noted that there are some potential limitations of the Cardinal et al. study. At this point, theirs is the only experiment demonstrating differential effects of the signaling conditions on self-control choices. Moreover, the effects of d-amphetamine in both the cue and no-cue conditions were extremely small. At the 20-s delay in the cue condition, only one dose (0.3 mg/kg) produced an increase in selection of the larger, more delayed reinforcer when compared to baseline responding. That is, the shifts of the functions, in either direction, were not particularly impressive considering that Cardinal et al.'s conclusions were based entirely on group data. Finally, in their cue condition, the presence or absence of an explicit cue was confounded with the presence or absence of a delay. That is, there was no delay associated with the smaller reinforcer and, thus, there was no opportunity to provide a unique signal for this reinforcer. Thus, further study seems needed on the modulation of the effects of d-amphetamine by the signaling conditions associated with reinforcement delay.

The Present Study

Reilly and Lattal (2004) developed a potentially useful procedure to assess effects of drugs on behavior maintained by delayed reinforcement and the modulating effect of delay-associated stimuli. They used a progressive-delay procedure to generate within-session delay gradients. In this procedure, responding is reinforced on a single schedule, and the delay to reinforcement increases by a fixed duration following each reinforcer presentation. The progressive nature of the procedure allowed the researchers to assess how a large range of delays affects behavior. Delay gradients were obtained by plotting response rates within each session as a function of the progressively increasing delay; the gradients typically showed a decreasing, hyperbolic-like function.

In their first experiment, Reilly and Lattal (2004) used a progressive-delay procedure that included a VI or a fixed-interval (FI) schedule in the initial link and a delay that progressively increased by a fixed duration in the terminal link. That is, responding was reinforced according to a tandem VI (or FI) 30-s progressive-time (PT) 2-s schedule of food presentation. Key pecks that satisfied the interval schedule initiated an unsignaled, non-resetting delay that began at 0 s and increased by 2 s following each reinforcer presentation. When a subject failed to respond for 7 min, the session was terminated, and the delay value in effect was considered the break point. On alternate sessions, yoked-interval schedules were arranged based upon the inter-food intervals from the immediately preceding progressive-delay session. Response rates under the delay condition decreased across the session to a greater extent than under the yoked-interval schedule. Thus, Reilly and Lattal were able to isolate the effects of reinforcement delay, independent of the effects of reinforcement rate.
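The logic of the progressive-delay procedure and its break-point measure can be sketched as follows; the increment and the 7-min pause criterion follow the values given in the text, but the simulated "pausing" rule and the example pause times are hypothetical, purely for illustration.

# A sketch of the progressive-delay logic described above (Reilly & Lattal, 2004,
# Experiment 1): the delay starts at 0 s, increases by a fixed increment after each
# reinforcer, and the delay in effect when responding stops is the break point.

def run_progressive_delay(pause_times_s, increment_s=2.0, pause_limit_s=7 * 60):
    """Return the break point (s) given the pause preceding each reinforcer."""
    delay_s = 0.0
    for pause_s in pause_times_s:
        if pause_s >= pause_limit_s:     # no responding for 7 min -> session ends
            break
        delay_s += increment_s           # the next reinforcer is delayed a bit longer
    return delay_s                       # delay value in effect = break point

# Pauses grow as the delay grows; the sixth pause exceeds 7 min, so the break point is 10 s.
print(run_progressive_delay([5, 10, 30, 90, 200, 500]))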

In their second experiment, Reilly and Lattal (2004) used the progressive-delay procedure to compare effects of signaled and unsignaled delays to reinforcement. Under the signaled-delay condition, a chained VI 30-s PT 2-s schedule was arranged and the delays were signaled by a blackout: when the subject pecked the center key after an average of 30 s, the houselight and keylight were extinguished and a blackout occurred for the entire delay interval. Under the unsignaled-delay condition, a tandem VI 30-s PT 2-s schedule was arranged: when the subject pecked the center key after an average of 30 s, the delay began, but the houselight and keylight remained illuminated until the delay was over. Under both conditions, the delay to reinforcement increased by 2 s following each reinforcer presentation. Signaled and unsignaled delays to reinforcement were examined using an ABA design; the order of conditions (signaled-unsignaled-signaled or unsignaled-signaled-unsignaled) was counterbalanced across subjects. In addition, delay gradients were assessed by plotting response rates at each delay. Delay gradients were analyzed with a modified version of Mazur's (1987) hyperbolic function:

B = B_I / (1 + kD),    (3)

where B denotes the rate of responding at a given delay (D) and B_I denotes the rate of responding at the 0-s delay. Reilly and Lattal found that break points and response rates were higher in the signaled condition. They also reported lower k values for the signaled condition, which is indicative of greater behavioral maintenance. These, and other, findings suggest that the presence of a signal during the delay increases response rates maintained by delayed reinforcement.
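As an illustration of how Equation 3 can be fitted to a within-session delay gradient, the sketch below applies scipy's curve_fit to hypothetical response rates; nothing here is taken from Reilly and Lattal's actual data, and scipy is assumed to be available.

# Fitting the delay-gradient hyperbola (Equation 3), B = B_I / (1 + k*D),
# to a hypothetical within-session delay gradient.

import numpy as np
from scipy.optimize import curve_fit

def hyperbolic_gradient(delay_s, b_initial, k):
    """Response rate at delay D given the 0-s rate (B_I) and discounting parameter k."""
    return b_initial / (1.0 + k * delay_s)

delays = np.array([0, 4, 8, 16, 32, 64, 128], dtype=float)   # nominal delays (s)
rates = np.array([80, 55, 42, 28, 17, 10, 6], dtype=float)   # responses/min (hypothetical)

(b_i, k), _ = curve_fit(hyperbolic_gradient, delays, rates, p0=(rates[0], 0.1))
print(f"B_I = {b_i:.1f} responses/min, k = {k:.3f}")  # lower k -> shallower gradient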

Using a procedure similar to that of Reilly and Lattal (2004) could be an extremely effective method for examining the effects of drugs on behavior maintained by signaled and unsignaled delays to reinforcement. The manner in which the delay progressively increases within each session allows effects of drugs on the entire delay function to be determined efficiently. Furthermore, self-control procedures involve differences in both the delay and the magnitude of reinforcement; the present procedure may provide a method to assess drug effects on delayed reinforcement without being confounded by magnitude.

The purpose of the present study was to compare effects of the stimulant d-amphetamine on behavior maintained by signaled and unsignaled delays to reinforcement. Procedures similar to those described by Reilly and Lattal (2004) were used to generate within-session delay gradients under signaled (chained) and unsignaled (tandem) conditions. Several predictions could be made based upon the previous literature. For instance, if the data are consistent with those of Reilly and Lattal (2004), then we would expect greater behavioral maintenance (i.e., shallower delay gradients and higher break points) during the signaled condition and the opposite effect (i.e., steeper delay gradients and lower break points) during the unsignaled condition. In addition, if Cardinal et al.'s (2000) results were due to a unique signal presented during the delay, then we would expect d-amphetamine to increase AUC and shift the function to the right during the signaled condition, but to decrease AUC and shift the function to the left during the unsignaled condition. If the effect is not due to signaling, then we would expect d-amphetamine to produce comparable changes in AUC and similar shifts of the function under both conditions.

Method

Subjects

Four male White Carneau pigeons were used (75, 358, 1871, 1985).

Two of the pigeons (75 and 358) were experimentally naïve, whereas the other two (1871 and 1985) previously had experienced choice procedures that included drug regimens. The pigeons were kept at approximately 80% of their free-feeding weights via post-session feeding as needed. All subjects had free access to water and health grit. The subjects were housed individually in a vivarium under a 12:12 hr light/dark cycle; the vivarium lights were extinguished at 7 p.m. and illuminated at 7 a.m.

Apparatus

Four operant-conditioning chambers (BRS-LVE model SEC-002), each with interior dimensions of 35.0 cm deep, 30.5 cm wide, and 36 cm high, were used. Two of the chambers were equipped with three keys on the right wall adjacent to the entrance of the chamber. The keys were 8.5 cm apart (center to center) and could be illuminated red, yellow, or green. The keys were located 26 cm from the floor of the chamber, and each side key was located 9 cm from its adjacent wall; only the center key was used. The shielded houselight used a 1.2-W bulb and was located 6.5 cm above the center key. A green stimulus light was located 5 cm to the left of the houselight and a red stimulus light 5 cm to its right; these lights were not used during the experiment. The other two chambers were identical to those just described, except that they contained only a single (centered) key and did not contain the green and red stimulus lights. Each peck on the illuminated center key that exceeded a force of 0.25 N was counted as a response. A food hopper containing mixed grain was accessed via a 5 cm by 6 cm opening located 11 cm directly underneath the center key. Reinforcement consisted of 4-s presentations of the hopper, during which the opening was illuminated and the keylight and houselight were extinguished. Ambient white noise was delivered by four speakers placed in the corners of the room. Each chamber contained a ventilation fan.

Experimental events were controlled and data were collected by a computer running MED-PC IV connected to MED Associates interfacing; the computer and interface equipment were located in an adjoining room.

Procedure

After Pigeons 358 and 75 were habituated to the operant chambers and magazine trained, pecking the yellow center key was shaped by differentially reinforcing successive approximations. The other two pigeons (1985 and 1871) were not experimentally naïve, so they were placed directly on an FR 1 schedule. For all pigeons, once pecking occurred reliably on the FR 1, the response requirement was raised to FR 5. After 3 to 5 sessions at FR 5, a VI 5-s schedule was arranged, and the value of the VI was raised gradually across sessions to 30 s. Once the training phase was completed, the pigeons were assigned to one of two experimental conditions such that each condition (signaled or unsignaled delays) included a naïve and an experienced pigeon.

The reinforcement contingencies in the signaled condition were arranged by a chained VI 30-s PT 4-s schedule. The cycle began with the illumination of the houselight and the yellow keylight. The first keypeck after an average of 30 s turned off the houselight and keylight and initiated a progressively increasing, non-resetting delay. After the delay elapsed, reinforcement was presented. The first reinforcer of each session was presented immediately (0-s delay), after which the delay to reinforcement increased in increments of 4 s; that is, the delay to the second reinforcer was 4 s, the delay to the third reinforcer was 8 s, and so on. Keypecks during the delay were counted but had no programmed consequences.

The reinforcement contingencies in the unsignaled condition were arranged by a tandem VI 30-s PT 4-s schedule. The first keypeck after an average of 30 s initiated a progressively increasing, non-resetting unsignaled delay.

The houselight and keylight remained illuminated until the delay was over, after which the houselight was extinguished and reinforcement was presented. The delay value increased within each session in the same fashion as described for the signaled condition. As in the signaled condition, keypecks during the delay had no programmed consequences. In both conditions, experimental sessions were terminated when a pigeon failed to peck the center key for 5 min (300 s) or when the session exceeded a time limit of 120 min. The delay value in effect at the time the session ended was considered the break point. Once stable performance was reached under the progressive-delay procedure, drug testing began. For any block of 10 sessions, behavior was considered stable when the mean break points for the first 5 sessions and the last 5 sessions of the block differed by less than 10% of the mean of all 10 sessions.
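The stability criterion just described can be stated compactly in code; the sketch below is a minimal illustration, and the break-point values in the example are hypothetical.

# The 10-session stability criterion: performance is stable when the mean break point
# of the first 5 sessions and that of the last 5 differ by less than 10% of the
# 10-session mean.

def is_stable(break_points):
    """Apply the stability criterion to a block of 10 break points (in seconds)."""
    assert len(break_points) == 10, "criterion is defined over blocks of 10 sessions"
    first_half = sum(break_points[:5]) / 5.0
    second_half = sum(break_points[5:]) / 5.0
    overall_mean = sum(break_points) / 10.0
    return abs(first_half - second_half) < 0.10 * overall_mean

print(is_stable([40, 44, 36, 48, 40, 44, 40, 36, 44, 40]))  # True for these values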

During the drug regimen, injections usually were administered on Tuesdays and Fridays. Data from the days that directly preceded injections served as the no-injection control data. Injections were made intramuscularly in the pigeon's breast area; injection sites alternated between the left and right sides of the sternum. The pre-treatment time was 15 min, during which the pigeons were placed into their home cages; this allowed the substance to be absorbed into the organism's system. Physiological saline was used as the injection vehicle. Dose-effect curves were determined for each subject. The drug regimen included administration of 0.3, 1.0, 1.78, 3.0, and 5.6 mg/kg d-amphetamine sulfate. This range included a dose that produced little or no effect on overall response rates and a dose that completely or nearly eliminated responding. The curves consisted of at least two injections of each dose for each pigeon. The order of exposure to the doses was mixed, with the constraint that no dose was given a second time before all doses had been given once. For each pigeon, once all doses and saline had been given twice, one or two "effective" doses were determined through visual inspection of the data. A dose was considered effective if it affected the break point and/or the shape of the delay gradient without dramatically suppressing overall response rates. For all pigeons the effective doses were 1.0 and 1.78 mg/kg. For Pigeons 1871 and 75, 1.0 and 1.78 mg/kg were administered 3 additional times to establish the reliability of their effects; however, for Pigeon 358, 1.0 mg/kg was not tested any additional times, and for Pigeon 1985, 1.0 mg/kg was tested only 1 additional time. The reason for not testing 1.0 mg/kg further for Pigeons 358 and 1985 was that this dose had little or no effect on the shape of their delay gradients.

After the above drug regimen was completed, the condition for each pigeon was reversed: the pigeons in the unsignaled condition (358 and 1985) were placed in the signaled condition, and vice versa. Once stable performance was achieved following the reversal, the same drug regimen described for the initial conditions was tested again, and effective doses were re-determined. For all pigeons during the second phase, 1.0 and 1.78 mg/kg were determined to be effective doses, and these doses were administered 1-3 additional times. After the drug regimen was completed during the second phase, the pigeons were returned to the original condition (reversal). After stability was achieved during the return to the baseline condition, saline and the effective doses determined during the previous phases (1.0 and 1.78 mg/kg) were administered 5 times each. Thus, under this ABA experimental design, two pigeons (1871 and 75) experienced a signaled-unsignaled-signaled sequence, and the other two pigeons (358 and 1985) experienced an unsignaled-signaled-unsignaled sequence.

Data Analysis

Response rates were expressed as responses per minute.

Response rates were calculated by dividing the total number of responses within each interval (i.e., at each delay) by the total time spent in each VI 30-s interval. Thus, for each session, a delay-of-reinforcement gradient was constructed by plotting response rate during the VI as a function of delay. In addition, mean response rates at the 0-s delay were obtained by averaging the response rate at 0 s across each determination of saline and d-amphetamine. Break points were obtained for each experimental session; the break point was the delay value in effect when the session was terminated, either via the time limit or when no responses occurred for 300 s (5 min).

To characterize the effects of d-amphetamine on the function relating response rate to reinforcement delay, the area under the curve (AUC) was analyzed (Myerson, Green, & Warusawitharana, 2004). The area under each curve was calculated using:

(x2 - x1)[(y1 + y2) / 2],    (6)

where the x1 values are any given delay and the x2 values are the next delay in the sequence. The delays (x values) were normalized by representing each delay as a proportion of the maximum delay (i.e., 196 s); that is, each delay was divided by the maximum delay. The y1 values were the mean response rates at a given delay and the y2 values were the mean response rates at the next delay in the sequence. Response rates (y values) were normalized by representing the mean response rate at each delay as a proportion of the mean response rate at the 0-s delay; that is, the mean response rate at each delay was divided by the mean response rate at the 0-s delay. Trapezoids were created from the series of successive delay values, the area of each trapezoid was calculated, and the areas of all trapezoids were summed. The AUC values obtained via Equation 6 were used to create dose-effect curves. A drug-induced increase in the AUC would indicate an attenuated effect of the delay (i.e., less delay discounting), whereas a decrease in the AUC would indicate an enhanced effect of the delay (i.e., greater delay discounting).
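The normalization and trapezoidal summation just described (Equation 6) can be illustrated with a short sketch; the delays and response rates below are hypothetical, and the 196-s maximum delay is taken from the text.

# Normalized area under the delay gradient: delays are expressed as a proportion of
# the maximum delay, rates as a proportion of the rate at the 0-s delay, and the
# normalized gradient is summed as a series of trapezoids.

def normalized_auc(delays_s, rates, max_delay_s=196.0):
    """Area under the normalized delay gradient."""
    xs = [d / max_delay_s for d in delays_s]
    ys = [r / rates[0] for r in rates]  # rates[0] is the rate at the 0-s delay
    auc = 0.0
    for (x1, y1), (x2, y2) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        auc += (x2 - x1) * (y1 + y2) / 2.0  # area of one trapezoid
    return auc

delays = [0, 4, 8, 16, 32, 64, 128, 196]     # nominal delays (s), hypothetical
rates = [80, 60, 45, 30, 18, 10, 6, 4]       # responses/min, hypothetical
print(round(normalized_auc(delays, rates), 3))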

Rate-dependency analyses also were performed on the mean response rates after administration of 1.0 and 1.78 mg/kg d-amphetamine with respect to responding after administration of saline. Mean saline response rates were transformed to log10, and rate dependency was calculated as follows:

log10(mean response rate at a given dose / mean response rate after saline).    (7)

The values obtained via Equation 7 were then plotted as a function of the mean saline response rates transformed to log10, and a linear regression line was fitted to the data.
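The rate-dependency computation (Equation 7) and the accompanying regression can be sketched as follows; the response rates are hypothetical, and numpy's polyfit is used simply as a stand-in linear-regression routine (the thesis does not specify the software used).

# Rate-dependency analysis: log10(drug rate / saline rate) as a function of
# log10(saline rate), with a fitted straight line.

import numpy as np

saline_rates = np.array([80.0, 55.0, 30.0, 15.0, 6.0])   # responses/min, hypothetical
drug_rates = np.array([60.0, 50.0, 35.0, 22.0, 12.0])    # responses/min, hypothetical

x = np.log10(saline_rates)                 # log control rate
y = np.log10(drug_rates / saline_rates)    # log (drug rate / control rate)

slope, intercept = np.polyfit(x, y, 1)     # linear regression
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
# A negative slope indicates rate dependency: low control rates are increased
# proportionally more (or decreased less) than high control rates.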

Results

Performance During Control Conditions

Figure 1 shows mean responses per minute during control sessions for each pigeon as a function of the nominal (programmed) delay under the signaled and unsignaled phases of the experiment. Under both signaled and unsignaled conditions, response rates were a negatively decelerating function of delay. The functions for the signaled condition, however, were much shallower and reached asymptote less quickly than those for the unsignaled condition. That is, behavior was better maintained by delayed reinforcement under the signaled condition than under the unsignaled condition. Greater behavioral maintenance was also indicated by higher break points when subjects were in the signaled phase. In 5 out of 6 instances in the signaled phase, the maximum break point of 196 s was reached; in contrast, in the unsignaled condition, none of the subjects reached the maximum break point (i.e., all subjects failed to reach the 196-s delay) (see Table 1). In general, control performance was recovered in the return to the baseline conditions for all subjects.

Figure 1. Mean response rates (responses per minute) for control days are plotted as a function of the nominal delay for each pigeon under the signaled and unsignaled conditions.

Table 1. Mean (range) break points (in seconds) reached by each pigeon in each condition from all sessions before which saline, 1.0 mg/kg, or 1.78 mg/kg d-amphetamine was administered. Means are from five sessions, unless indicated otherwise.

Pigeon 1871 (Phase 1: Signaled; Phase 2: Unsignaled; Phase 3: Signaled)
  Saline: 196 (a); 33.6 (20-48); 188 ( )
  1.0 mg/kg: (60-196)
  1.78 mg/kg: (72-176)

Pigeon 75 (Phase 1: Signaled; Phase 2: Unsignaled; Phase 3: Signaled)
  Saline: (24-188)
  1.0 mg/kg: (72-196)
  1.78 mg/kg: (32-196); 196

Pigeon 358 (Phase 1: Unsignaled; Phase 2: Signaled; Phase 3: Unsignaled)
  Saline: 20.8 (12-28); 75.2 (60-92); 28.2 (12-40)
  1.0 mg/kg: 42.0 (40-44) (b); (44-188)
  1.78 mg/kg: 84.0 (16-152); (36-184)

Pigeon 1985 (Phase 1: Unsignaled; Phase 2: Signaled; Phase 3: Unsignaled)
  Saline: 41.6 (16-80); (88-196); 35.2 (20-48)
  1.0 mg/kg: 45.3 (16-84) (b); (16-72)
  1.78 mg/kg: 39.2 (12-124); (20-48)

a. No range after a mean indicates that the break point was the same for all sessions.
b. In this condition, Pigeons 358 and 1985 received 2 and 3 administrations of 1.0 mg/kg, respectively.

That is, for the most part, when the baseline conditions were reinstated, the delay functions and break points were similar to those previously obtained in the signaled condition (Pigeons 1871 and 75) and in the unsignaled condition (Pigeons 358 and 1985). In some cases, there appeared to be slight carry-over effects. For example, break points in the signaled condition for the 2 pigeons that started in the unsignaled condition were lower (M = s) than those for the 2 pigeons that started in the signaled condition (M = 196 s).

Performance during Saline, Control, and Drug Conditions

Figure 2 shows mean responses per minute after administration of saline (closed circles) and 1.0 mg/kg d-amphetamine (open circles) as a function of the nominal delay for each condition. In addition, the curves were extended to the mean break points for each condition in both Figures 2 and 3. Although several doses of d-amphetamine were tested, most of the analyses presented will focus on effects of the 1.0 and 1.78 mg/kg doses. The lowest dose (0.3 mg/kg) did not produce a reliable effect on performance, and the higher doses (3.0 and 5.6 mg/kg) typically suppressed overall response rates substantially.

For Pigeons 1871 and 75, who experienced a signaled-unsignaled-signaled sequence, the delay functions during the initial signaled condition shifted upwards after injections of 1.0 mg/kg when compared to saline. At delays over 50 s, response rates during the initial signaled condition increased compared to saline; at shorter delays, response rates either were relatively unchanged (1871) or slightly decreased (75). The delay functions for Pigeons 1871 and 75 during the unsignaled condition shifted upwards (i.e., higher response rates) and to the right (i.e., higher break points) following administrations of 1.0 mg/kg. Break points were 4.62 and 1.38 times greater after administrations of 1.0 mg/kg compared to saline for Pigeons 1871 and 75, respectively.

Figure 2. Mean responses per minute in the VI as a function of the nominal delay (s) after administration of saline (filled symbols) and 1.0 mg/kg d-amphetamine (unfilled symbols) during the signaled and unsignaled conditions for each pigeon. Each function is the mean of data from five sessions and extends to the mean break point. Note that the y-axes differ across pigeons.

Note that for Pigeon 1871, during the unsignaled condition, 1.0 mg/kg produced a very large shift of the function to the right. Indeed, this dose increased the break point from 33.6 s to s. Although 1.0 mg/kg slightly increased response rates at several delays for Pigeon 358 during the unsignaled condition, response rates remained relatively low at delays greater than 12 s. For Pigeons 1871 and 75, 1.0 mg/kg shifted the function upwards, and break points remained at the maximum during the initial signaled condition. During the return to the signaled condition, however, response rates for these two pigeons largely were unaffected by this dose. Specifically, response rates at delays greater than 50 s were elevated following administration of 1.0 mg/kg during the first, but not during the second, exposure to the signaled condition. Thus, the effects of the drug that were produced during the initial signaled condition were not completely recovered during the return to the signaled phase.

For Pigeons 358 and 1985 (who experienced an unsignaled-signaled-unsignaled sequence), the delay functions shifted slightly to the right following administration of 1.0 mg/kg during the initial unsignaled condition. In addition, this dose decreased response rates at the 0-s delay for both pigeons. The break points for Pigeons 358 and 1985 during the initial unsignaled condition were 2.02 and 1.09 times greater after administrations of 1.0 mg/kg, respectively. For Pigeons 358 and 1985 during the signaled condition, the delay functions shifted upwards and, when possible, to the right (Pigeon 358). For Pigeon 358 the break points were 2.61 times higher following administration of 1.0 mg/kg compared to saline. Interestingly, 1.0 mg/kg produced rather large increases in both break points and response rates during the signaled condition for Pigeon 358. The delay function for Pigeon 1985 during the return to the unsignaled condition was similar to the function during the initial unsignaled condition (i.e., low mean break point,


OBSERVING AND ATTENDING IN A DELAYED MATCHING-TO-SAMPLE PREPARATION IN PIGEONS. Bryan S. Lovelace, B.S. Thesis Prepared for the Degree of OBSERVING AND ATTENDING IN A DELAYED MATCHING-TO-SAMPLE PREPARATION IN PIGEONS Bryan S. Lovelace, B.S. Thesis Prepared for the Degree of MASTER OF SCIENCE UNIVERSITY OF NORTH TEXAS December 2008 APPROVED:

More information

Behavioural Processes

Behavioural Processes Behavioural Processes 89 (2012) 212 218 Contents lists available at SciVerse ScienceDirect Behavioural Processes j o ur nal homep age : www.elsevier.com/locate/behavproc Providing a reinforcement history

More information

STEPHEN P. KRAMER. (Kojima, 1980; Lattal, 1975; Maki, Moe, &

STEPHEN P. KRAMER. (Kojima, 1980; Lattal, 1975; Maki, Moe, & JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR1 1982, 38, 71-85 NUMBER I (JULY) MEMORY FOR RECENT BEHAVIOR IN THE PIGEON STEPHEN P. KRAMER CORRECTIONS DIVISION, DRAPER, UTAH Variations of the symbolic

More information

The Development of Context-specific Operant Sensitization to d-amphetamine

The Development of Context-specific Operant Sensitization to d-amphetamine Utah State University DigitalCommons@USU All Graduate Theses and Dissertations Graduate Studies 5-009 The Development of Context-specific Operant Sensitization to d-amphetamine Wesley Paul Thomas Utah

More information

Sum of responding as a function of sum of reinforcement on two-key concurrent schedules

Sum of responding as a function of sum of reinforcement on two-key concurrent schedules Animal Learning & Behavior 1977, 5 (1),11-114 Sum of responding as a function of sum of reinforcement on two-key concurrent schedules FRANCES K. McSWEENEY Washington State University, Pul/man, Washington

More information

Travel Distance and Stimulus Duration on Observing Responses by Rats

Travel Distance and Stimulus Duration on Observing Responses by Rats EUROPEAN JOURNAL OF BEHAVIOR ANALYSIS 2010, 11, 79-91 NUMBER 1 (SUMMER 2010) 79 Travel Distance and Stimulus Duration on Observing Responses by Rats Rogelio Escobar National Autonomous University of Mexico

More information

THE SEVERAL ROLES OF STIMULI IN TOKEN REINFORCEMENT CHRISTOPHER E. BULLOCK

THE SEVERAL ROLES OF STIMULI IN TOKEN REINFORCEMENT CHRISTOPHER E. BULLOCK JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2015, 103, 269 287 NUMBER 2 (MARCH) THE SEVERAL ROLES OF STIMULI IN TOKEN REINFORCEMENT CHRISTOPHER E. BULLOCK UNIVERSITY OF FLORIDA AND TIMOTHY D. HACKENBERG

More information

Operant response topographies of rats receiving food or water reinforcers on FR or FI reinforcement schedules

Operant response topographies of rats receiving food or water reinforcers on FR or FI reinforcement schedules Animal Learning& Behavior 1981,9 (3),406-410 Operant response topographies of rats receiving food or water reinforcers on FR or FI reinforcement schedules JOHN H. HULL, TIMOTHY J. BARTLETT, and ROBERT

More information

RESPONSE-INDEPENDENT CONDITIONED REINFORCEMENT IN AN OBSERVING PROCEDURE

RESPONSE-INDEPENDENT CONDITIONED REINFORCEMENT IN AN OBSERVING PROCEDURE RESPONSE-INDEPENDENT CONDITIONED REINFORCEMENT IN AN OBSERVING PROCEDURE By ANTHONY L. DEFULIO A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE

More information

GENERALIZED IDENTITY MATCHING-TO-SAMPLE IN RATS USING OLFACTORY STIMULI. Tracy M. Peña

GENERALIZED IDENTITY MATCHING-TO-SAMPLE IN RATS USING OLFACTORY STIMULI. Tracy M. Peña GENERALIZED IDENTITY MATCHING-TO-SAMPLE IN RATS USING OLFACTORY STIMULI Tracy M. Peña A Thesis Submitted to the University of North Carolina at Wilmington in Partial Fulfillment Of the Requirements for

More information

The effect of sample duration and cue on a double temporal discrimination q

The effect of sample duration and cue on a double temporal discrimination q Available online at www.sciencedirect.com Learning and Motivation 39 (2008) 71 94 www.elsevier.com/locate/l&m The effect of sample duration and cue on a double temporal discrimination q Luís Oliveira,

More information

between successive DMTS choice phases.

between successive DMTS choice phases. JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1996, 66, 231 242 NUMBER 2(SEPTEMBER) SEPARATING THE EFFECTS OF TRIAL-SPECIFIC AND AVERAGE SAMPLE-STIMULUS DURATION IN DELAYED MATCHING TO SAMPLE IN PIGEONS

More information

PIGEONS CHOICES BETWEEN FIXED-RATIO AND LINEAR OR GEOMETRIC ESCALATING SCHEDULES PAUL NEUMAN, WILLIAM H. AHEARN, AND PHILIP N.

PIGEONS CHOICES BETWEEN FIXED-RATIO AND LINEAR OR GEOMETRIC ESCALATING SCHEDULES PAUL NEUMAN, WILLIAM H. AHEARN, AND PHILIP N. JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2000, 73, 93 102 NUMBER 1(JANUARY) PIGEONS CHOICES BETWEEN FIXED-RATIO AND LINEAR OR GEOMETRIC ESCALATING SCHEDULES PAUL NEUMAN, WILLIAM H. AHEARN, AND

More information

PREFERENCE FOR FIXED-INTERVAL SCHEDULES: AN ALTERNATIVE MODEL'

PREFERENCE FOR FIXED-INTERVAL SCHEDULES: AN ALTERNATIVE MODEL' JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1973, 2, 393-43 NUMBER 3 (NOVEMBER) PREFERENCE FOR FIXED-INTERVAL SCHEDULES: AN ALTERNATIVE MODEL' M. C. DAVISON AND W. TEMPLE UNIVERSITY OF AUCKLAND AND

More information

Reinforcer Magnitude and Resistance to Change of Forgetting Functions and Response Rates

Reinforcer Magnitude and Resistance to Change of Forgetting Functions and Response Rates Utah State University DigitalCommons@USU All Graduate Theses and Dissertations Graduate Studies 8-2012 Reinforcer Magnitude and Resistance to Change of Forgetting Functions and Response Rates Meredith

More information

Schedule Induced Polydipsia: Effects of Inter-Food Interval on Access to Water as a Reinforcer

Schedule Induced Polydipsia: Effects of Inter-Food Interval on Access to Water as a Reinforcer Western Michigan University ScholarWorks at WMU Master's Theses Graduate College 8-1974 Schedule Induced Polydipsia: Effects of Inter-Food Interval on Access to Water as a Reinforcer Richard H. Weiss Western

More information

Jennifer J. McComas and Ellie C. Hartman. Angel Jimenez

Jennifer J. McComas and Ellie C. Hartman. Angel Jimenez The Psychological Record, 28, 58, 57 528 Some Effects of Magnitude of Reinforcement on Persistence of Responding Jennifer J. McComas and Ellie C. Hartman The University of Minnesota Angel Jimenez The University

More information

Concurrent schedule responding as a function ofbody weight

Concurrent schedule responding as a function ofbody weight Animal Learning & Behavior 1975, Vol. 3 (3), 264-270 Concurrent schedule responding as a function ofbody weight FRANCES K. McSWEENEY Washington State University, Pullman, Washington 99163 Five pigeons

More information

UNIVERSITY OF WALES SWANSEA AND WEST VIRGINIA UNIVERSITY

UNIVERSITY OF WALES SWANSEA AND WEST VIRGINIA UNIVERSITY JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 05, 3, 3 45 NUMBER (JANUARY) WITHIN-SUBJECT TESTING OF THE SIGNALED-REINFORCEMENT EFFECT ON OPERANT RESPONDING AS MEASURED BY RESPONSE RATE AND RESISTANCE

More information

MOUNT ALLISON UNIVERSITY

MOUNT ALLISON UNIVERSITY JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 00, 79, 6 NUMBER (MARCH) RESPONDING FOR SUCROSE AND WHEEL-RUNNING REINFORCEMENT: EFFECTS OF SUCROSE CONCENTRATION AND WHEEL-RUNNING REINFORCER DURATION

More information

ANTECEDENT REINFORCEMENT CONTINGENCIES IN THE STIMULUS CONTROL OF AN A UDITORY DISCRIMINA TION' ROSEMARY PIERREL AND SCOT BLUE

ANTECEDENT REINFORCEMENT CONTINGENCIES IN THE STIMULUS CONTROL OF AN A UDITORY DISCRIMINA TION' ROSEMARY PIERREL AND SCOT BLUE JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR ANTECEDENT REINFORCEMENT CONTINGENCIES IN THE STIMULUS CONTROL OF AN A UDITORY DISCRIMINA TION' ROSEMARY PIERREL AND SCOT BLUE BROWN UNIVERSITY 1967, 10,

More information

FIXED-RATIO PUNISHMENT1 N. H. AZRIN,2 W. C. HOLZ,2 AND D. F. HAKE3

FIXED-RATIO PUNISHMENT1 N. H. AZRIN,2 W. C. HOLZ,2 AND D. F. HAKE3 JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR VOLUME 6, NUMBER 2 APRIL, 1963 FIXED-RATIO PUNISHMENT1 N. H. AZRIN,2 W. C. HOLZ,2 AND D. F. HAKE3 Responses were maintained by a variable-interval schedule

More information

Supporting Online Material for

Supporting Online Material for www.sciencemag.org/cgi/content/full/319/5871/1849/dc1 Supporting Online Material for Rule Learning by Rats Robin A. Murphy,* Esther Mondragón, Victoria A. Murphy This PDF file includes: *To whom correspondence

More information

EFFECTS OF INTERRESPONSE-TIME SHAPING ON MULTIPLE SCHEDULE PERFORMANCE. RAFAEL BEJARANO University of Kansas

EFFECTS OF INTERRESPONSE-TIME SHAPING ON MULTIPLE SCHEDULE PERFORMANCE. RAFAEL BEJARANO University of Kansas The Psychological Record, 2004, 54, 479-490 EFFECTS OF INTERRESPONSE-TIME SHAPING ON MULTIPLE SCHEDULE PERFORMANCE RAFAEL BEJARANO University of Kansas The experiment reported herein was conducted to determine

More information

The digital copy of this thesis is protected by the Copyright Act 1994 (New Zealand).

The digital copy of this thesis is protected by the Copyright Act 1994 (New Zealand). http://researchcommons.waikato.ac.nz/ Research Commons at the University of Waikato Copyright Statement: The digital copy of this thesis is protected by the Copyright Act 1994 (New Zealand). The thesis

More information

PROBABILITY OF SHOCK IN THE PRESENCE AND ABSENCE OF CS IN FEAR CONDITIONING 1

PROBABILITY OF SHOCK IN THE PRESENCE AND ABSENCE OF CS IN FEAR CONDITIONING 1 Journal of Comparative and Physiological Psychology 1968, Vol. 66, No. I, 1-5 PROBABILITY OF SHOCK IN THE PRESENCE AND ABSENCE OF CS IN FEAR CONDITIONING 1 ROBERT A. RESCORLA Yale University 2 experiments

More information

RESPONSE PERSISTENCE UNDER RATIO AND INTERVAL REINFORCEMENT SCHEDULES KENNON A. LATTAL, MARK P. REILLY, AND JAMES P. KOHN

RESPONSE PERSISTENCE UNDER RATIO AND INTERVAL REINFORCEMENT SCHEDULES KENNON A. LATTAL, MARK P. REILLY, AND JAMES P. KOHN JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1998, 70, 165 183 NUMBER 2(SEPTEMBER) RESPONSE PERSISTENCE UNDER RATIO AND INTERVAL REINFORCEMENT SCHEDULES KENNON A. LATTAL, MARK P. REILLY, AND JAMES

More information

Transitive inference in pigeons: Control for differential value transfer

Transitive inference in pigeons: Control for differential value transfer Psychonomic Bulletin & Review 1997, 4 (1), 113-117 Transitive inference in pigeons: Control for differential value transfer JANICE E. WEAVER University of Kentucky, Lexington, Kentucky JANICE N. STEIRN

More information

Pigeons Choose to Gamble in a Categorical Discrimination Task

Pigeons Choose to Gamble in a Categorical Discrimination Task Analysis of Gambling Behavior Volume 11 Article 2 2017 Pigeons Choose to Gamble in a Categorical Discrimination Task Nathaniel C. Rice 1. United States Army Medical Research Institute of Chemical Defense,

More information

Observing behavior: Redundant stimuli and time since information

Observing behavior: Redundant stimuli and time since information Animal Learning & Behavior 1978,6 (4),380-384 Copyright 1978 by The Psychonornic Society, nc. Observing behavior: Redundant stimuli and time since information BRUCE A. WALD Utah State University, Logan,

More information

THE SUNK COST EFFECT WITH PIGEONS: SOME DETERMINANTS OF DECISIONS ABOUT PERSISTENCE ANNE C. MACASKILL

THE SUNK COST EFFECT WITH PIGEONS: SOME DETERMINANTS OF DECISIONS ABOUT PERSISTENCE ANNE C. MACASKILL JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2012, 97, 85 100 NUMBER 1(JANUARY) THE SUNK COST EFFECT WITH PIGEONS: SOME DETERMINANTS OF DECISIONS ABOUT PERSISTENCE ANNE C. MACASKILL UNIVERSITY OF FLORIDA

More information

EVALUATIONS OF DELAYED REINFORCEMENT IN CHILDREN WITH DEVELOPMENTAL DISABILITIES

EVALUATIONS OF DELAYED REINFORCEMENT IN CHILDREN WITH DEVELOPMENTAL DISABILITIES EVALUATIONS OF DELAYED REINFORCEMENT IN CHILDREN WITH DEVELOPMENTAL DISABILITIES By JOLENE RACHEL SY A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT

More information

INTRODUCING NEW STIMULI IN FADING

INTRODUCING NEW STIMULI IN FADING JOURNL OF THE EXPERMENTL NLYSS OF BEHVOR 1979, 32, 121-127 NUMBER (JULY) CQUSTON OF STMULUS CONTROL WHLE NTRODUCNG NEW STMUL N FDNG LNNY FELDS THE COLLEGE OF STTEN SLND fter establishing a discrimination

More information

Evidence for a Magnitude Effect in Temporal Discounting With Pigeons

Evidence for a Magnitude Effect in Temporal Discounting With Pigeons Journal of Experimental Psychology: Animal Behavior Processes 2012, Vol. 38, No. 1, 102 108 2012 American Psychological Association 0097-7403/12/$12.00 DOI: 10.1037/a0026345 BRIEF REPORT Evidence for a

More information

Interference in pigeons' long-term memory viewed as a retrieval problem

Interference in pigeons' long-term memory viewed as a retrieval problem Animal Learning & Behavior 1981,9 (4),581-586 Interference in pigeons' long-term memory viewed as a retrieval problem DAVID R. THOMAS, ALAN R. McKELVIE, MICHAEL RANNEY, and THOMAS B. MOYE University ofcolorado,

More information

USING A SELF-CONTROL TRAINING PROCEDURE TO INCREASE APPROPRIATE BEHAVIOR MARK R. DIXON AND LINDA J. HAYES

USING A SELF-CONTROL TRAINING PROCEDURE TO INCREASE APPROPRIATE BEHAVIOR MARK R. DIXON AND LINDA J. HAYES JOURNAL OF APPLIED BEHAVIOR ANALYSIS 1998, 31, 203 210 NUMBER 2(SUMMER 1998) USING A SELF-CONTROL TRAINING PROCEDURE TO INCREASE APPROPRIATE BEHAVIOR MARK R. DIXON AND LINDA J. HAYES UNIVERSITY OF NEVADA

More information

DISCRIMINATION IN RATS OSAKA CITY UNIVERSITY. to emit the response in question. Within this. in the way of presenting the enabling stimulus.

DISCRIMINATION IN RATS OSAKA CITY UNIVERSITY. to emit the response in question. Within this. in the way of presenting the enabling stimulus. JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR EFFECTS OF DISCRETE-TRIAL AND FREE-OPERANT PROCEDURES ON THE ACQUISITION AND MAINTENANCE OF SUCCESSIVE DISCRIMINATION IN RATS SHIN HACHIYA AND MASATO ITO

More information

The effects of two different states of food deprivation for 6 roosters was measured with a

The effects of two different states of food deprivation for 6 roosters was measured with a Effects of Food Deprivation on Memory Performance. Jacob. L. Kerewaro Abstract The effects of two different states of food deprivation for 6 roosters was measured with a delayed matching-to-sample (DMTS)

More information

REINFORCEMENT AT CONSTANT RELATIVE IMMEDIACY OF REINFORCEMENT A THESIS. Presented to. The Faculty of the Division of Graduate. Studies and Research

REINFORCEMENT AT CONSTANT RELATIVE IMMEDIACY OF REINFORCEMENT A THESIS. Presented to. The Faculty of the Division of Graduate. Studies and Research TWO-KEY CONCURRENT RESPONDING: CHOICE AND DELAYS OF REINFORCEMENT AT CONSTANT RELATIVE IMMEDIACY OF REINFORCEMENT A THESIS Presented to The Faculty of the Division of Graduate Studies and Research By George

More information

Instrumental Conditioning I

Instrumental Conditioning I Instrumental Conditioning I Basic Procedures and Processes Instrumental or Operant Conditioning? These terms both refer to learned changes in behavior that occur as a result of the consequences of the

More information

NIH Public Access Author Manuscript Learn Behav. Author manuscript; available in PMC 2010 February 26.

NIH Public Access Author Manuscript Learn Behav. Author manuscript; available in PMC 2010 February 26. NIH Public Access Author Manuscript Published in final edited form as: Learn Behav. 2009 November ; 37(4): 357 364. doi:10.3758/lb.37.4.357. Behavioral Momentum and Relapse of Extinguished Operant Responding

More information

A PROCEDURE TO TEACH SELF-CONTROL TO CHILDREN WITH ATTENTION DEFICIT HYPERACTIVITY DISORDER LISA M. BINDER AND MARK R. DIXON PATRICK M.

A PROCEDURE TO TEACH SELF-CONTROL TO CHILDREN WITH ATTENTION DEFICIT HYPERACTIVITY DISORDER LISA M. BINDER AND MARK R. DIXON PATRICK M. JOURNAL OF APPLIED BEHAVIOR ANALYSIS 2000, 33, 233 237 NUMBER 2(SUMMER 2000) A PROCEDURE TO TEACH SELF-CONTROL TO CHILDREN WITH ATTENTION DEFICIT HYPERACTIVITY DISORDER LISA M. BINDER AND MARK R. DIXON

More information

Signaled reinforcement effects on fixed-interval performance of rats with lever depressing or releasing as a target response 1

Signaled reinforcement effects on fixed-interval performance of rats with lever depressing or releasing as a target response 1 Japanese Psychological Research 1998, Volume 40, No. 2, 104 110 Short Report Signaled reinforcement effects on fixed-interval performance of rats with lever depressing or releasing as a target response

More information

REINFORCEMENT OF PROBE RESPONSES AND ACQUISITION OF STIMULUS CONTROL IN FADING PROCEDURES

REINFORCEMENT OF PROBE RESPONSES AND ACQUISITION OF STIMULUS CONTROL IN FADING PROCEDURES JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1985, 439 235-241 NUMBER 2 (MARCH) REINFORCEMENT OF PROBE RESPONSES AND ACQUISITION OF STIMULUS CONTROL IN FADING PROCEDURES LANNY FIELDS THE COLLEGE OF

More information

Schedules of Reinforcement

Schedules of Reinforcement Schedules of Reinforcement MACE, PRATT, ZANGRILLO & STEEGE (2011) FISHER, PIAZZA & ROANE CH 4 Rules that describe how will be reinforced are 1. Every response gets SR+ ( ) vs where each response gets 0

More information

A Memory Model for Decision Processes in Pigeons

A Memory Model for Decision Processes in Pigeons From M. L. Commons, R.J. Herrnstein, & A.R. Wagner (Eds.). 1983. Quantitative Analyses of Behavior: Discrimination Processes. Cambridge, MA: Ballinger (Vol. IV, Chapter 1, pages 3-19). A Memory Model for

More information

Comparison of Direct and Indirect Reinforcement Contingencies on Task Acquisition. A Thesis Presented. Robert Mark Grant

Comparison of Direct and Indirect Reinforcement Contingencies on Task Acquisition. A Thesis Presented. Robert Mark Grant Comparison of Direct and Indirect Reinforcement Contingencies on Task Acquisition A Thesis Presented By Robert Mark Grant In partial fulfillment of the requirements for the degree of Master of Science

More information

Magazine approach during a signal for food depends on Pavlovian, not instrumental, conditioning.

Magazine approach during a signal for food depends on Pavlovian, not instrumental, conditioning. In Journal of Experimental Psychology: Animal Behavior Processes http://www.apa.org/pubs/journals/xan/index.aspx 2013, vol. 39 (2), pp 107 116 2013 American Psychological Association DOI: 10.1037/a0031315

More information

Contextual Control of Chained Instrumental Behaviors

Contextual Control of Chained Instrumental Behaviors Journal of Experimental Psychology: Animal Learning and Cognition 2016 American Psychological Association 2016, Vol. 42, No. 4, 401 414 2329-8456/16/$12.00 http://dx.doi.org/10.1037/xan0000112 Contextual

More information

PSYC2010: Brain and Behaviour

PSYC2010: Brain and Behaviour PSYC2010: Brain and Behaviour PSYC2010 Notes Textbook used Week 1-3: Bouton, M.E. (2016). Learning and Behavior: A Contemporary Synthesis. 2nd Ed. Sinauer Week 4-6: Rieger, E. (Ed.) (2014) Abnormal Psychology:

More information

DOES THE TEMPORAL PLACEMENT OF FOOD-PELLET REINFORCEMENT ALTER INDUCTION WHEN RATS RESPOND ON A THREE-COMPONENT MULTIPLE SCHEDULE?

DOES THE TEMPORAL PLACEMENT OF FOOD-PELLET REINFORCEMENT ALTER INDUCTION WHEN RATS RESPOND ON A THREE-COMPONENT MULTIPLE SCHEDULE? The Psychological Record, 2004, 54, 319-332 DOES THE TEMPORAL PLACEMENT OF FOOD-PELLET REINFORCEMENT ALTER INDUCTION WHEN RATS RESPOND ON A THREE-COMPONENT MULTIPLE SCHEDULE? JEFFREY N. WEATHERLY, KELSEY

More information

SUBSTITUTION EFFECTS IN A GENERALIZED TOKEN ECONOMY WITH PIGEONS LEONARDO F. ANDRADE 1 AND TIMOTHY D. HACKENBERG 2

SUBSTITUTION EFFECTS IN A GENERALIZED TOKEN ECONOMY WITH PIGEONS LEONARDO F. ANDRADE 1 AND TIMOTHY D. HACKENBERG 2 JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 217, 17, 123 135 NUMBER 1 (JANUARY) SUBSTITUTION EFFECTS IN A GENERALIZED TOKEN ECONOMY WITH PIGEONS LEONARDO F. ANDRADE 1 AND TIMOTHY D. HACKENBERG 2 1

More information

Pigeons memory for time: Assessment of the role of subjective shortening in the duration-comparison procedure

Pigeons memory for time: Assessment of the role of subjective shortening in the duration-comparison procedure Learning & Behavior 2009, 37 (1), 74-84 doi:10.3758/lb.37.1.74 Pigeons memory for time: Assessment of the role of subjective shortening in the duration-comparison procedure PATRICK VAN ROOYEN AND ANGELO

More information

CONCURRENT CHAINS UNIVERSITY-IMPERIAL VALLEY CAMPUS. links differ with respect to the percentage of

CONCURRENT CHAINS UNIVERSITY-IMPERIAL VALLEY CAMPUS. links differ with respect to the percentage of JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1987, 47, 57-72 CHOICE BETWEEN RELIABLE AND UNRELIABLE OUTCOMES: MIXED PERCENTAGE-REINFORCEMENT IN CONCURRENT CHAINS MARCIA L. SPETCH AND ROGER DUNN DALHOUSIE

More information

PURSUING THE PAVLOVIAN CONTRIBUTIONS TO INDUCTION IN RATS RESPONDING FOR 1% SUCROSE REINFORCEMENT

PURSUING THE PAVLOVIAN CONTRIBUTIONS TO INDUCTION IN RATS RESPONDING FOR 1% SUCROSE REINFORCEMENT The Psychological Record, 2007, 57, 577 592 PURSUING THE PAVLOVIAN CONTRIBUTIONS TO INDUCTION IN RATS RESPONDING FOR 1% SUCROSE REINFORCEMENT JEFFREY N. WEATHERLY, AMBER HULS, and ASHLEY KULLAND University

More information

Revista Mexicana de Análisis de la Conducta ISSN: Sociedad Mexicana de Análisis de la Conducta.

Revista Mexicana de Análisis de la Conducta ISSN: Sociedad Mexicana de Análisis de la Conducta. Revista Mexicana de Análisis de la Conducta ISSN: 0185-4534 editora@rmac-mx.org Sociedad Mexicana de Análisis de la Conducta México LATTAL, KENNON A.; SMITH, JULIE M. BEHAVIORAL CONTRAST WHEN RESPONSES

More information

Remembering: The role of extraneous reinforcement

Remembering: The role of extraneous reinforcement Learning & Behavior 2005, 33 (3), 309-323 Remembering: The role of extraneous reinforcement GLENN S. BROWN and K. GEOFFREY WHITE University of Otago, Dunedin, New Zealand In two experiments, pigeons responding

More information

VERNON L. QUINSEY DALHOUSIE UNIVERSITY. in the two conditions. If this were possible, well understood where the criterion response is

VERNON L. QUINSEY DALHOUSIE UNIVERSITY. in the two conditions. If this were possible, well understood where the criterion response is JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR LICK-SHOCK CONTINGENCIES IN THE RATT1 VERNON L. QUINSEY DALHOUSIE UNIVERSITY 1972, 17, 119-125 NUMBER I (JANUARY) Hungry rats were allowed to lick an 8%

More information

Effects of Increased Exposure to Training Trials with Children with Autism. A Thesis Presented. Melissa A. Ezold

Effects of Increased Exposure to Training Trials with Children with Autism. A Thesis Presented. Melissa A. Ezold Effects of Increased Exposure to Training Trials with Children with Autism A Thesis Presented by Melissa A. Ezold The Department of Counseling and Applied Educational Psychology In partial fulfillment

More information

Delayed Matching-To-Sample Test in Macaques

Delayed Matching-To-Sample Test in Macaques C O N F I D E N T I A L Delayed Matching-To-Sample Test in Macaques DATE This study was conducted under the terms of a Materials Transfer and Services Agreement between NeuroDetective International and

More information

Testing the Functional Equivalence of Retention Intervals and Sample-Stimulus Disparity in Conditional Discrimination

Testing the Functional Equivalence of Retention Intervals and Sample-Stimulus Disparity in Conditional Discrimination Utah State University DigitalCommons@USU All Graduate Theses and Dissertations Graduate Studies 5-2008 Testing the Functional Equivalence of Retention Intervals and Sample-Stimulus Disparity in Conditional

More information

Describing Naturally Occurring Schedules: Analysis of Feedback Functions for Shooting During Basketball Games. A Thesis Presented

Describing Naturally Occurring Schedules: Analysis of Feedback Functions for Shooting During Basketball Games. A Thesis Presented Describing Naturally Occurring Schedules: Analysis of Feedback Functions for Shooting During Basketball Games A Thesis Presented by Nicholas R. Vanselow The Department of Counseling and Applied Educational

More information

Some Parameters of the Second-Order Conditioning of Fear in Rats

Some Parameters of the Second-Order Conditioning of Fear in Rats University of Nebraska - Lincoln DigitalCommons@University of Nebraska - Lincoln Papers in Behavior and Biological Sciences Papers in the Biological Sciences 1969 Some Parameters of the Second-Order Conditioning

More information

CONDITIONED REINFORCEMENT IN RATS'

CONDITIONED REINFORCEMENT IN RATS' JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1969, 12, 261-268 NUMBER 2 (MARCH) CONCURRENT SCHEULES OF PRIMARY AN CONITIONE REINFORCEMENT IN RATS' ONAL W. ZIMMERMAN CARLETON UNIVERSITY Rats responded

More information

Excerpt from LABORATORY MANUAL PRINCIPLES OF PSYCHOLOGY: EXPERIMENTAL FOUNDATIONS PSYCHOLOGY

Excerpt from LABORATORY MANUAL PRINCIPLES OF PSYCHOLOGY: EXPERIMENTAL FOUNDATIONS PSYCHOLOGY Excerpt from LABORATORY MANUAL PRINCIPLES OF PSYCHOLOGY: EXPERIMENTAL FOUNDATIONS PSYCHOLOGY 122 2001 Participating Faculty Professor James Dickson (dickson@stolaf.edu) Professor Dana Gross (grossd@stolaf.edu)

More information

Role of the anterior cingulate cortex in the control over behaviour by Pavlovian conditioned stimuli

Role of the anterior cingulate cortex in the control over behaviour by Pavlovian conditioned stimuli Role of the anterior cingulate cortex in the control over behaviour by Pavlovian conditioned stimuli in rats RN Cardinal, JA Parkinson, H Djafari Marbini, AJ Toner, TW Robbins, BJ Everitt Departments of

More information

Unit 6 Learning.

Unit 6 Learning. Unit 6 Learning https://www.apstudynotes.org/psychology/outlines/chapter-6-learning/ 1. Overview 1. Learning 1. A long lasting change in behavior resulting from experience 2. Classical Conditioning 1.

More information

Operant matching. Sebastian Seung 9.29 Lecture 6: February 24, 2004

Operant matching. Sebastian Seung 9.29 Lecture 6: February 24, 2004 MIT Department of Brain and Cognitive Sciences 9.29J, Spring 2004 - Introduction to Computational Neuroscience Instructor: Professor Sebastian Seung Operant matching Sebastian Seung 9.29 Lecture 6: February

More information

Effects of a Novel Fentanyl Derivative on Drug Discrimination and Learning in Rhesus Monkeys

Effects of a Novel Fentanyl Derivative on Drug Discrimination and Learning in Rhesus Monkeys PII S0091-3057(99)00058-1 Pharmacology Biochemistry and Behavior, Vol. 64, No. 2, pp. 367 371, 1999 1999 Elsevier Science Inc. Printed in the USA. All rights reserved 0091-3057/99/$ see front matter Effects

More information

NIH Public Access Author Manuscript J Exp Psychol Anim Behav Process. Author manuscript; available in PMC 2005 November 14.

NIH Public Access Author Manuscript J Exp Psychol Anim Behav Process. Author manuscript; available in PMC 2005 November 14. NIH Public Access Author Manuscript Published in final edited form as: J Exp Psychol Anim Behav Process. 2005 April ; 31(2): 213 225. Timing in Choice Experiments Jeremie Jozefowiez and Daniel T. Cerutti

More information

The Persistence-Strengthening Effects of DRA: An Illustration of Bidirectional Translational Research

The Persistence-Strengthening Effects of DRA: An Illustration of Bidirectional Translational Research The Behavior Analyst 2009, 32, 000 000 No. 2 (Fall) The Persistence-Strengthening Effects of DRA: An Illustration of Bidirectional Translational Research F. Charles Mace University of Southern Maine Jennifer

More information

Extinction. n Operant Extinction: n Ideally combined with DRO (esp. DRI) n No longer reinforcing operant behavioral response

Extinction. n Operant Extinction: n Ideally combined with DRO (esp. DRI) n No longer reinforcing operant behavioral response Extinction Extinction n Operant Extinction: n No longer reinforcing operant behavioral response n Ultimately reducing B s probability to zero n Ideally combined with DRO (esp. DRI) n Initial & Temporary

More information

UNIVERSITY OF IOWA AND SOUTHERN ILLINOIS UNIVERSITY AT CARBONDALE

UNIVERSITY OF IOWA AND SOUTHERN ILLINOIS UNIVERSITY AT CARBONDALE JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2002, 78, 365 373 NUMBER 3(NOVEMBER) BRIEF PRESENTATIONS ARE SUFFICIENT FOR PIGEONS TO DISCRIMINATE ARRAYS OF SAME AND DIFFERENT STIMULI EDWARD A. WASSERMAN,

More information

REPEATED MEASUREMENTS OF REINFORCEMENT SCHEDULE EFFECTS ON GRADIENTS OF STIMULUS CONTROL' MICHAEL D. ZEILER

REPEATED MEASUREMENTS OF REINFORCEMENT SCHEDULE EFFECTS ON GRADIENTS OF STIMULUS CONTROL' MICHAEL D. ZEILER JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR REPEATED MEASUREMENTS OF REINFORCEMENT SCHEDULE EFFECTS ON GRADIENTS OF STIMULUS CONTROL' MICHAEL D. ZEILER UNIVERSITY OF IOWA 1969, 12, 451-461 NUMBER

More information

Concurrent Chains Schedules as a Method to Study Choice Between Alcohol Associated Conditioned Reinforcers

Concurrent Chains Schedules as a Method to Study Choice Between Alcohol Associated Conditioned Reinforcers Utah State University DigitalCommons@USU Psychology Faculty Publications Psychology 1-2012 Concurrent Chains Schedules as a Method to Study Choice Between Alcohol Associated Conditioned Reinforcers Corina

More information

TOLERANCE TO EFFECTS OF COCAINE ON BEHAVIOR UNDER A RESPONSE-INITIATED FIXED-INTERVAL SCHEDULE MATTHEW T. WEAVER AND MARC N.

TOLERANCE TO EFFECTS OF COCAINE ON BEHAVIOR UNDER A RESPONSE-INITIATED FIXED-INTERVAL SCHEDULE MATTHEW T. WEAVER AND MARC N. JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2008, 90, 207 218 NUMBER 2(SEPTEMBER) TOLERANCE TO EFFECTS OF COCAINE ON BEHAVIOR UNDER A RESPONSE-INITIATED FIXED-INTERVAL SCHEDULE MATTHEW T. WEAVER AND

More information

Oddity learning in the pigeon: Effect of negative instances, correction, and number of incorrect alternatives

Oddity learning in the pigeon: Effect of negative instances, correction, and number of incorrect alternatives Animal Learning & Behavior 1980,8(4),621-629 Oddity learning in the pigeon: Effect of negative instances, correction, and number of incorrect alternatives THOMAS R. ZENTALL University ofkentucky, Lexington,

More information

Operant Conditioning B.F. SKINNER

Operant Conditioning B.F. SKINNER Operant Conditioning B.F. SKINNER Reinforcement in Operant Conditioning Behavior Consequence Patronize Elmo s Diner It s all a matter of consequences. Rewarding Stimulus Presented Tendency to tell jokes

More information