PERIODIC RESPONSE-REINFORCER CONTIGUITY: TEMPORAL CONTROL BUT NOT AS WE KNOW IT!

MICHAEL KEENAN
University of Ulster at Coleraine


The Psychological Record, 1999, 49,

Two experiments using rats show the effects of introducing response-reinforcer contiguity on a modified recycling conjunctive fixed-time 30 s fixed-ratio 1 schedule of food reinforcement. In Experiment 1, the frequency of programmed contiguity was varied across conditions. The general finding was an increase in overall rate of responding for all rats when the incidence of obtained contiguity was relatively high. This increase in responding was accompanied by unusual response patterns for some rats. These patterns persisted when the rats were subsequently exposed to a fixed-interval schedule. Other rats produced fixed-interval-like performance when the incidence of obtained contiguity increased. Similar findings were observed in Experiment 2, where a more direct comparison was made between the modified recycling conjunctive schedule and the fixed-interval schedule. Results in general emphasize the importance of the modified recycling conjunctive schedule as a tool for exploring behavioral adaptation to periodic reinforcement. The history effects reported in both experiments are discussed with reference to the notion of schedule-induced behavior and temporal control.

This research was conducted when Michael Keenan was holder of a European Exchange Fellowship to the University of Cologne in Germany. I thank Prof. W. F. Angermeir for the facilities and support during my time there. Reprints may be obtained from Michael Keenan, School of Behavioural and Communication Sciences, University of Ulster at Coleraine, Cromore Road, Coleraine, County Londonderry, N. Ireland, BT52 1SA. M.Keenan@ulst.ac.uk

When significant events in the environment occur on a regular basis, people adapt by producing regularities in their behavior. Some of these events may be separated in time by a year or more (e.g., anniversaries) and others occur on a smaller time scale (e.g., weekends off work, lunch breaks during the day, etc.). In the laboratory, simulations of periodicity in behavior have been studied with both humans and nonhumans. The general approach has been to arrange a contingency between a selected behavior and an environmental event to ensure that the environmental event occurs at regular intervals.

The technical term for this arrangement is a 'schedule of reinforcement.' A wide variety of schedules of reinforcement have been studied in the laboratory and each is associated with a distinctive pattern of behavior (Catania, 1998). One of the most studied laboratory procedures for investigating periodicity in behavior is the fixed-interval (FI) schedule of reinforcement. Reinforcer delivery on this schedule is dependent upon the occurrence of a single response after a fixed period of time has elapsed since the previous reinforcer presentation. Baseline performance on this schedule is typically described as involving a postreinforcement pause (PRP) followed by either an accelerating or a constant response rate up to the next reinforcer delivery (Baron & Leinenweber, 1994; Cumming & Schoenfeld, 1958; Dews, 1970; Ferster & Skinner, 1957; see Hyten & Madden, 1993, for a discussion of problems arising from imprecision in the description of human FI performance). In the analysis of this performance a variety of techniques have been employed. These can be grouped together according to whether they involved simple parametric investigations of the interreinforcer interval, manipulation of the single response contingency, disruption of responding during the interreinforcer interval by the presentation of other stimuli, or the replacement of occasional reinforcer presentations by other stimuli (for extended discussions of these and other related procedures see Davey, 1987; Keenan, 1986; Lowe & Wearden, 1981; Richelle & Lejeune, 1980; Staddon, 1983; Zeiler, 1977).

The analysis of patterns of behavior on schedules generally has proven difficult because even on the simplest of schedules it is recognized that behavior is multiply determined (Morse & Kelleher, 1977; Zeiler, 1997). Thus, although the formal description of a schedule may reference simply the programmed relation between the behavior and the environmental event, closer inspection shows that other variables operate collectively to produce baseline responding. For example, Keenan and Leslie (1986) (see also Keenan & Toal, 1991) offered a structural analysis of the independent variables that collectively define a FI schedule. They pointed out that there were four variables acting in concert: (a) the time between reinforcer presentations; (b) the single response contingency; (c) response-reinforcer contiguity; and (d) the time from one reinforcer presentation to the location in time of the next response dependency.

The inspiration for this work came from the effects observed on another schedule that is similar in makeup to a FI schedule, a recycling conjunctive fixed-time (FT) fixed-ratio (FR) 1 schedule. A FI schedule can be seen as a tandem FT FR 1 schedule of reinforcement. Thus, once the FT component expires, and only then, a FR 1 contingency comes into operation. A major effect of this particular construction is that it ensures periodic occurrences of response-reinforcer contiguity. A recycling conjunctive FT FR 1 schedule is similar to a FI schedule in that it too has a single response contingency and it also presents reinforcement at regular intervals.

However, unlike the FI schedule, a single response executed at any time during a FT component results in reinforcer delivery at the end of that FT component. Also, if a response fails to occur during a FT component, that component ends without any stimulus event and the next FT component begins immediately. Although periodic reinforcement is assured in this arrangement, periodic response-reinforcer contiguity is not guaranteed. Baseline performance on this schedule is characterized by a variety of response-reinforcer delays, low overall rates of responding, and a response distribution markedly different from that found on a FI schedule (cf. Keenan & Watt, 1990).

Notwithstanding this marked difference in performance between these two schedules, Keenan and Leslie (1986) demonstrated how a modified version of the recycling conjunctive FT FR 1 could be an extremely useful tool for exploring the effects of periodic response-reinforcer contiguity. The response-strengthening effects of response-reinforcer contiguity have been well documented (e.g., Thomas, 1981), and from Keenan and Leslie's study it would seem that these effects, within the context of regular occurrences of contiguity, are a major contributing factor to overall performance on a FI schedule. Using the framework of the recycling schedule, Keenan and Leslie designed a procedure that produced an increased incidence of contiguity but did not alter the single response contingency, and at the same time had minimal effects on the duration of the FT component. They did this by changing the consequences of responding in either the final 2 s or 4 s of the FT component. A response during this terminal window of the FT component would not normally be contiguous with reinforcer presentation. Thus, for example, if a response occurred at 28 s on a 30-s FT component, then, assuming no other response occurred, there would be a delay of 2 s between this response and reinforcer delivery. In a modified version of the procedure, however, a response in a terminal window of a FT component produced a reinforcer immediately and terminated that component. To recap, throughout all FT components on a normal recycling conjunctive FT FR 1 schedule only one response is required within each component for a reinforcer to be delivered at the end of that component. In the modified version, however, if a response occurred in a terminal window of a FT component, response-reinforcer contiguity occurred. Keenan and Leslie found that performance on this modified schedule was markedly different from that observed on the normal schedule; the PRP was relatively unaffected, the usual pause-respond-pause patterning during the FT component was replaced by FI-like patterning, and there was a three-fold increase in overall response rate for some animals. The occurrence of FI-like patterning and the occurrence of a relatively large number of responses during the FT component, even though only one response was necessary for reinforcer presentation, are precisely the behavioral characteristics that have intrigued researchers interested in the FI schedule.
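To make the relations among these three arrangements concrete, the following minimal sketch simulates a single interreinforcer interval under each of them. It is purely illustrative: the function name, the Bernoulli stand-in for the animal's responding, and the 1-s time step are assumptions introduced here, not features of any of the original procedures.

import random

FT = 30       # duration of the fixed-time (FT) component, in seconds
WINDOW = 4    # terminal window in which a response can be contiguous with food

def simulate_cycle(schedule, p_respond=0.1, rng=random):
    """Return (reinforced, response_reinforcer_delay) for one cycle.

    schedule = "fi"        : tandem FT FR 1 (i.e., FI): the first response after
                             the FT has elapsed produces food immediately.
    schedule = "recycling" : recycling conjunctive FT FR 1: one response anywhere
                             in the cycle produces food at the END of the cycle;
                             a silent cycle simply recycles.
    schedule = "modified"  : as "recycling", except that a response in the final
                             WINDOW seconds produces food immediately (a zero delay).
    """
    last_response = None
    for t in range(FT):                       # seconds 0..FT-1 of the cycle
        if rng.random() < p_respond:          # crude stand-in for the rat's behavior
            last_response = t
            if schedule == "modified" and t >= FT - WINDOW:
                return True, 0.0              # immediate (zero-delay) reinforcement
    if schedule == "fi":
        # FT has elapsed without food; on FI the next response (assumed to occur
        # eventually) produces food at once, so contiguity is guaranteed.
        return True, 0.0
    if last_response is None:
        return False, None                    # FR 1 unmet: no food, next cycle begins
    return True, float(FT - last_response)    # delayed food at the end of the cycle

Run over many cycles, the "modified" branch preserves both periodic food delivery and the single response requirement, yet it arranges response-reinforcer contiguity only when responding happens to fall in the terminal window, which is the property exploited by Keenan and Leslie (1986).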

Keenan and Watt (1990) highlighted the importance of this finding:

This procedure is exciting because it not only gives the experimenter a certain degree of control over contiguity during the interreinforcer interval, but it does so without contaminating the spontaneous regulation on the schedule or without introducing major changes in the response contingency. Furthermore, differing response rates can be produced without introducing significant changes in overall rate of reinforcement. (p. 129)

In other words, two schedules currently exist that differ radically in their formal specifications (i.e., the FI and the modified recycling conjunctive FT FR 1) but which produce similar patterns of behavior. The implication is that key variables shared by each schedule are responsible for the similarity in the patterning of behavior that is observed. A primary candidate is periodic response-reinforcer contiguity.

In this study the versatility of the modified recycling conjunctive FT FR 1 was explored further. The primary objective of the first experiment was to examine the possibility that the procedure could be fine-tuned so as to provide control over the percentages of obtained response-reinforcer contiguities. This was done by varying, in a semirandom manner, the probability that the first response in the terminal segment of the FT component would produce reinforcement immediately. Another objective of the experiment arose incidentally from data obtained early in training. When response-reinforcer contiguities were first introduced, an unexpected behavior developed for some of the rats. They stood on the lever immediately after reinforcer presentation and began gnawing at various areas on the plastic walls and on the light fittings. This behavior was so vigorous that special metal panels had to be inserted into the chambers to cover up holes that were developing. This casing was inserted about 8 days after the abnormal responding had been established. It was decided to continue with the rats that engaged in this behavior because it was assumed that 'normal' responding would return after extended exposure to periodic reinforcement. When this proved not to be the case, it was decided to continue the experiment with these rats to see what behavior would develop eventually. The findings obtained add to the existing body of literature concerned with the role played by behavioral history in determining schedule performance (cf. Bickel, Higgins, Kirby, & Johnston, 1988; Johnston, Bickel, Higgins, & Morris, 1991; Leinenweber, Nietzel, & Baron, 1996; Tatham, Wanchisen, & Hineline, 1993; Wanchisen, Tatham, & Mooney, 1989).

Experiment 1

Method

Subjects

Nine experimentally naive, 6-month-old male albino Wistar rats were used initially. They were caged in groups of three with water freely available, and they were maintained at approximately 85% of their free-feeding weights by feeding after experimental sessions.

Apparatus

Eight Campden Instruments rat test chambers with plastic walls and light fittings were used. Later in the study, metal casing had to be fitted to the inside of the chambers because some rats chewed holes in the walls and in the light fittings (see below). A single retractable lever was positioned in the center of the magazine wall. The magazine itself comprised a recessed tray situated at floor level in the bottom left-hand corner of the wall. The reinforcer used was a 45-mg precision pellet (BioServ Inc.), which was accompanied by a 3-s illumination of the tray light. Contingencies were controlled by an Apple II+ computer programmed in BASIC.

Procedure

The sequence of conditions and the numbers of sessions in each condition for each rat are given in Table 1. After initial training on a continuous reinforcement schedule, all rats were transferred to a recycling conjunctive fixed-time (FT) 30 s fixed-ratio (FR) 1 schedule of reinforcement. On this schedule one response was required anywhere inside a 30-s cycle (i.e., the FT component) to produce a reinforcer at the end of that cycle. Failure to respond in any cycle meant that that cycle ended without a reinforcer, or the delivery of any other stimulus, and the next cycle began immediately. Unlike more conventional schedules, response-reinforcer contiguity (hereafter called "zero-delay" (ZD) reinforcement; cf. Keenan & Leslie, 1986) is not explicitly programmed on this schedule. Because the conditions which followed this initial condition were designed to manipulate the probability of obtaining programmed ZDs, this condition is referred to as 'Condition p=0.' Each session terminated after either 100 reinforcements or after 90 min, whichever occurred first, and all rats received one session daily from Monday through Saturday. Given that at least one response was required inside a FT component, the total number of reinforced cycles, expressed as a percentage of the total number of cycles that had elapsed in a session, was used as a measure of performance efficiency.

Table 1
Sequence of Conditions, Numbers of Sessions, and the Percentage Efficiency in Each Condition in Experiment 1

[Tabular values are not legible in this transcription. For each rat (K1, K2, K3, K4, K6, K8, K10, K12, and K15) the table listed the order of conditions (from among p=0, p=0.2, p=0.4, p=0.6, p=0.8, p=1.0, Extinction, and FI 26 s), the number of sessions in each condition, and the percentage efficiency.]
Note. Figures in parentheses are the standard deviations of the session means for the percentage efficiency.

Behavior was considered stable in this, and in all other conditions, when average performance efficiency over six consecutive sessions was at least 70% and when there were no systematic directional changes in overall response rate. This stability criterion was used because of the intrinsic problems in controlling for the rate-enhancing effects of accidental zero-delay reinforcements (see below).

In the next series of conditions the basic recycling conjunctive schedule was modified so as to introduce a degree of control over the frequency of obtained ZDs. Again only one response was required anywhere inside a FT component for reinforcement to be delivered at the end of that component. This time, however, if a response occurred in the terminal 4 s of a FT component, the reinforcer was delivered immediately, depending upon the programmed probability of its occurrence. For example, if the probability of obtaining a ZD for the first response in the terminal 4 s was set at p=1.0, then this response produced a reinforcer immediately and terminated the cycle. Thus, only one reinforcer was presented at the end of a cycle. If, however, the probability of obtaining a ZD was set at p=0.4, then the first response in the terminal 4 s was effective in producing a ZD on a maximum of only 40 out of 100 occasions. Where a ZD was not produced by the first response in the terminal 4 s of a cycle, further responses were also ineffective in producing a ZD. Note, however, that a reinforcer was always delivered at the end of a FT component if the basic FR 1 requirement had been satisfied anywhere during it. This aspect of the schedule arrangement creates the possibility that accidental ZDs may occur. In summary, each cycle ended with a reinforcer if at least one response had occurred anywhere during it. Additionally, the probability of the first response in the terminal 4 s of a cycle producing a ZD was varied across conditions.

For Rats K1, K2, K3, and K4 the probability of programmed ZDs was increased across conditions. However, because Rats K2 and K3 never actually obtained more than 20% of scheduled programmed ZDs in any condition, they were eventually eliminated from the study. For Rats K6, K8, K10, K12, and K15, the probability of programmed ZDs was varied in a semirandom manner across conditions. After the final session on the recycling conjunctive schedule, all rats were exposed to three sessions of extinction. During extinction the same recycling procedure was in effect, only this time food pellets were removed from the dispenser. Finally, all rats were exposed to a fixed-interval (FI) 26 s schedule of reinforcement. This value of the FI schedule was chosen because it matched the time on the recycling conjunctive schedule after which ZDs were available. Each session terminated after 100 reinforcers had been delivered.

Results

The results presented here for each rat are means calculated over the last five sessions in each condition. All rats except K1, K10, and K12 engaged in the unexpected gnawing behavior.
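The cycle logic and the efficiency measure described in the Procedure can be summarized in the following sketch. It is a minimal illustration only: the representation of a cycle as a list of response times, the function names, and the random-number source are assumptions made here, not the original BASIC program.

import random

FT, WINDOW = 30.0, 4.0        # FT component and terminal window, in seconds

def run_cycle(p_zd, response_times, rng=random):
    """One cycle of the modified recycling conjunctive FT 30-s FR 1 schedule.

    p_zd           : programmed probability that the FIRST response in the final
                     4 s produces food immediately (a programmed zero delay, ZD).
    response_times : times (s from cycle onset) at which lever presses occurred.
    Returns (reinforced, delay): delay is the obtained response-reinforcer delay
    in seconds, or None if the cycle ends without food.
    """
    first_window_press_used = False
    last_press = None
    for t in sorted(response_times):
        if t >= FT:
            break
        last_press = t
        if t >= FT - WINDOW and not first_window_press_used:
            first_window_press_used = True
            if rng.random() < p_zd:
                return True, 0.0              # programmed ZD: food now, cycle ends early
            # otherwise neither this press nor later presses can produce a ZD
    if last_press is None:
        return False, None                    # FR 1 unmet: cycle recycles with no stimulus
    return True, FT - last_press              # food at the end of the cycle

def percentage_efficiency(outcomes):
    """Reinforced cycles as a percentage of all cycles elapsed in a session."""
    return 100.0 * sum(1 for reinforced, _ in outcomes if reinforced) / len(outcomes)

On this reading, the stability criterion amounts to requiring the session values of percentage_efficiency to average at least 70 over six consecutive sessions, with no systematic directional change in overall response rate.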

Response Distributions

Figures 1 and 2 show the response distribution in successive seconds after food for each rat (except K2 & K3) in each condition. In Condition p=0, responding was characterized by few responses throughout the interval for all rats. For K4, K6, K8, and K12, a pause-respond-pause pattern predominated, while for the other rats responding was relatively constant after about s into the interval.

Figure 1. Mean number of responses in successive seconds after reinforcement for Rats K1, K4, K6, and K8, in each condition of Experiment 1. Note that because ZDs were introduced in the final 4 s of the FT component, response distributions for all conditions except Condition p=0 are shown for the first 26 s after reinforcement.

The final condition for each rat was the FI schedule. Across rats a number of patterns emerged. K1, K10, and K12 (the rats that did not engage in gnawing) produced typical FI patterning in that, after extended pausing, responding accelerated up to the end of the interreinforcer interval. For the other rats, some unusual patterns were maintained. Responding for K4 and K8 began very early after reinforcer delivery and accelerated rapidly throughout the remainder of the interval. K6 and K15 produced bimodal response patterns, with a peak in responding at about 5 seconds after reinforcer delivery and another peak towards the end of the interval. Between the first and last conditions the patterning across rats mirrored the differences in responding shown on the FI schedule. A consistent finding across rats, with the exception of K12, was that once responding had increased in the terminal segment of the interreinforcer interval in any one condition it tended to remain elevated to the same extent. For K12, the increase in the rate of responding during the terminal segment of the interreinforcer interval in Condition p=0.8 was diminished substantially in the subsequent condition (Condition p=0.4).

Figure 2. Mean number of responses in successive seconds after reinforcement for Rats K10, K12, and K15, in each condition of Experiment 1. Note that because ZDs were introduced in the final 4 s of the FT component, response distributions for all conditions except Condition p=0 are shown for the first 26 s after reinforcement.

Response Rate and ZDs

Figure 3 shows response rate and the total numbers of obtained ZDs, programmed and accidental, for each rat in each condition. (An accidental ZD was defined as a delay between a response and a reinforcer which was less than 1 s.) In the first condition (Condition p=0) the maximum overall response rate, of just over 10 responses per minute, was produced by K15. The maximum number of accidental ZDs obtained by any rat in this condition was about 20 out of a possible 100, and this was by K15. Across animals there were variations in the extent to which response rate and numbers of programmed ZDs were related. For K1 and K4 there was some correspondence between the increasing probability of programmed ZDs across conditions and the total numbers of ZDs obtained. Up to Condition p=1.0, increases in the percentage of obtained ZDs were mirrored by increases in overall response rate, such that the highest rates on the recycling conjunctive schedule were recorded in this condition. Thereafter, however, the effect of the FI schedule on overall response rate differed for these two rats. For K1, response rate decreased quite substantially, and for K4 it continued to rise slightly. The remaining rats (K6, K8, K10, K12, & K15) had the probability of programmed ZDs varied in a semirandom manner across conditions.
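The programmed versus accidental distinction used throughout these results can be expressed as a simple counting rule. The record format assumed below (one delay and one flag per reinforced cycle) is hypothetical; the less-than-1-s criterion is the one stated above.

def classify_zds(reinforced_cycles, criterion=1.0):
    """Return (programmed, accidental) counts of zero-delay reinforcements.

    reinforced_cycles : iterable of (delay_in_seconds, was_programmed_zd) pairs,
                        one per reinforced cycle (assumed record format).
    """
    programmed = sum(1 for delay, was_programmed in reinforced_cycles if was_programmed)
    accidental = sum(1 for delay, was_programmed in reinforced_cycles
                     if not was_programmed and delay < criterion)
    return programmed, accidental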

Figure 3. Overall response rate (top axis) and total number of ZDs, programmed and accidental (bottom axis), in each condition for each rat in Experiment 1.

Two main effects emerged. Firstly, once the total numbers of ZDs for K6 and K8 had reached a maximum, subsequent decreases in the probability of programmed ZDs had little effect on their frequencies; there was, though, a high percentage of accidental ZDs. Overall response rates for these two rats tended to increase slightly across conditions. The second type of effect which emerged across conditions was seen for K10, K12, and K15. In the main, relatively large increases or decreases in the numbers of ZDs produced corresponding changes in the overall response rates. It was noted again that decreases in the probability of programmed ZDs did not prevent substantial numbers of accidental ZDs from occurring.

Figure 4. Median postreinforcement pause duration for each rat (except K2 & K3) in each condition in Experiment 1. Bars indicate standard deviations of the session means.

Although K2 and K3 were eventually eliminated from the study, some of their results are included (see Figure 3). The reason for this is to demonstrate some interesting difficulties encountered in gaining control over the production of ZDs by the procedures used here. Both rats developed a very pronounced pause-respond-pause pattern, with almost 90% of responses occurring in the middle 10 s of the interval after food. Consequently, as the probability of programmed ZDs was increased across conditions for these rats, there was not a corresponding increase in the total numbers of obtained ZDs. In fact, for K2 there were more ZDs in Condition p=0.2 than in Condition p=0.6.

Pausing

The median postreinforcement pause (PRP) durations are shown for all rats (except K2 & K3) in Figure 4. Pauses ranged from about 7-17 s in the first condition. Thereafter the effects on pausing varied across rats. For Rats K1, K10, and K12 the largest PRPs occurred in the final condition (the FI schedule). For the other animals there was a general tendency for PRPs to decrease across conditions, so that minimal pausing occurred on the FI schedule.

Discussion

Previous studies with Sprague Dawley rats have shown that the recycling conjunctive FT x-s FR 1 schedule produces low overall response rates. Also, responding during the interreinforcer interval is characterized by either a pause-respond-pause pattern, or else a pause followed by a constant low response rate throughout the rest of the interval. Similar findings were reported here across all rats in Condition p=0. During this condition, the incidence of obtained, accidental ZDs was very low.

The introduction of programmed ZDs produced a number of effects. When K2 and K3 are excluded (because the numbers of obtained ZDs did not correspond with the opportunity for obtaining them), the main effect for the other rats was an increase in overall rate of responding (cf. Keenan & Leslie, 1986). For some rats (K4, K6, & K8), once response rate had increased, it was maintained at this level even though the probability of programmed ZDs was varied across conditions. This can be accounted for by noting that the numbers of accidental ZDs kept the overall total of ZDs relatively high. Other rats (K10, K12, & K15) showed evidence of differential control over response rates by programmed ZDs. That is, when variations in the probability of programmed ZDs resulted in variations in the overall numbers of obtained ZDs, overall response rates varied accordingly.

One of the most intriguing aspects of the findings concerns the effects on response patterning. Usually schedules of reinforcement are noted for the consistency in response patterning that they produce across nonhumans. This was not the case here. Instead, a variety of performances was obtained across rats on the final FI schedule.

That is, they exhibited predictable, stable patterns of responding as a function of their history of responding on the modified recycling conjunctive schedule (cf. Johnston et al., 1991). Of these, only K1, K10, and K12 produced 'typical' FI patterning. The performances of these rats prior to the FI condition were reminiscent of findings reported previously by Keenan and Leslie (1986). That is, they resembled FI patterning. Interestingly, these rats did not engage in the unusual gnawing behavior noted earlier. For the other rats, who did engage in this behavior, there were some highly unusual patterns produced on the final FI schedule. K4 and K8 increased responding across the interreinforcer interval, and they had unusually small PRPs. Similar PRP durations occurred for K6 and K15, but patterning was bimodal. These unusual response patterns are all the more remarkable when one bears in mind that each rat had over 100 sessions of periodic food presentation prior to being exposed to the FI schedule. In other words, even though there was evidence for periodicity in behavior, the patterning normally associated with temporal control on FI schedules did not emerge (Richelle & Lejeune, 1980). Instead, unintentional history effects occurred. These findings were fortuitous, and they are the first recorded instances of such behavior that is uncontaminated by different response requirements and different interreinforcer intervals prior to exposure on a FI schedule (Bickel et al., 1988; Johnston et al., 1991; Leinenweber et al., 1996; Tatham et al., 1993; Wanchisen et al., 1989). An account of the unusual patterns obtained for these animals is deferred until the General Discussion.

In general, the findings were relatively successful with regard to the first objective of the study. They demonstrate that it is possible to attain a fair degree of control over the numbers of obtained response-reinforcer contiguities on a temporally based schedule that delivers periodic food reinforcement. Precise control, however, was not possible because the rate-enhancing effects of ZDs, whether accidental or programmed, ensured that the overall numbers of obtained ZDs were often higher than programmed. Following on from this, though, the results show how both response rate and patterning are affected by the incidence of ZDs. This is an important finding insofar as all previous research with the FI schedule has confounded the contribution played by the temporal distribution of response-reinforcer contiguities with that played by the temporal distribution of response dependencies (Keenan, 1982). Results from two animals (K2 & K3) also show some of the difficulties in working with this schedule. If responding does not occur in the terminal window of the FT component, ZDs do not occur and responding does not increase in rate. This was the case for these animals despite the fact that they were exposed to about 100 sessions each of periodic reinforcer presentations. The response patterning produced by these animals has an important bearing on theoretical accounts of FI performance, and this is discussed later in the General Discussion.

In the next experiment, the objective was simply to see how similar the performances were on the modified recycling conjunctive schedule with maximum opportunity to obtain ZDs (i.e., Condition p=1.0) and on the FI schedule. The experiment had commenced at the same time as Experiment 1 and thus none of the findings reported above had yet been obtained. It had been assumed, on the basis of previous work with the recycling conjunctive schedule, that it would be a relatively straightforward task. However, similar disruptions in patterning occurred for the same reasons (i.e., excessive gnawing of the plastic casing of the chamber). As before, rats that engaged in this behavior were retained for the study because it was assumed (incorrectly as it turned out) that the disruption would abate after continued exposure to periodic reinforcement.

Experiment 2

Method

Subjects

Six experimentally naive, 6-month-old male albino Wistar rats were used initially. They were caged in pairs with water freely available, and they were maintained at approximately 85% of their free-feeding weights by feeding after experimental sessions.

Apparatus

The apparatus described previously was also used here.

Procedure

The sequence of conditions and the numbers of sessions in each condition for each rat are given in Table 2. After initial training on a continuous reinforcement procedure, two rats (K26 & K28) were transferred to a recycling conjunctive FT 30 s FR 1 schedule of reinforcement (Condition p=0; see Experiment 1). The other rats (K32, K33, K34, & K35) were transferred to the modified version of this schedule with programmed ZDs (Condition p=1.0; see Experiment 1). Throughout all conditions a session terminated after 100 reinforcements, or after 90 min, whichever occurred first, and all rats received one session daily from Monday through to Saturday. The stability criteria used in Experiment 1 were also used here.

Using an ABA reversal design, Rats K26 and K28 were transferred to Condition p=1.0 before returning to Condition p=0. Following the last session in Condition p=0, the food pellets were removed from the dispenser and the rats were exposed to three sessions of extinction with this procedure. Thereafter they were transferred to a FI 26 s schedule of reinforcement. Finally, three sessions in extinction with the FI procedure (i.e., using an empty pellet dispenser) preceded a return to Condition p=1.0. For the other rats an ABA reversal design was used whereby they were transferred from Condition p=1.0 to a FI 26 s schedule and then back to Condition p=1.0. Three sessions of extinction with the procedure last used separated the transfer across conditions for each rat.
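The design just described can be restated compactly as ordered condition sequences. The listing below is simply a data summary of the Procedure, with "EXT" marking the three-session extinction probes conducted with whichever procedure was last in effect; it is not part of any original control program.

# Condition orders in Experiment 2, restated from the Procedure text above.
CONDITION_SEQUENCES = {
    ("K26", "K28"):               ["p=0", "p=1.0", "p=0", "EXT", "FI 26 s", "EXT", "p=1.0"],
    ("K32", "K33", "K34", "K35"): ["p=1.0", "EXT", "FI 26 s", "EXT", "p=1.0"],
}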

Table 2
Sequence of Conditions, Numbers of Sessions, and Percentage Efficiency in Each Condition in Experiment 2

[Tabular values are not legible in this transcription. For each rat (K26, K28, K32, K33, K34, and K35) the table listed the order of conditions, the number of sessions in each condition, and the percentage efficiency.]
Note. Figures in parentheses are the standard deviations of the session means for the percentage efficiency.

Results

The results presented here for each rat are means calculated over the last five sessions in each condition. All rats except K33 engaged in unexpected gnawing behavior. There were over 90 ZDs for all rats in Condition p=1.0.

Response Distributions

Figure 5 shows the response distribution in successive seconds after food for each rat in each condition. In Condition p=0 for K26 and K28, patterning was very similar in that responding started off at a low level within the first few seconds of the interval and accelerated slowly across the interval before leveling off in the final 10 seconds. When programmed ZDs were introduced in Condition p=1.0, changes in response patterning were comparable for both rats. That is, there was an abrupt increase in responding in the first 5 s after food followed by an acceleration in responding throughout the remainder of the interval. When Condition p=0 was reintroduced, the abrupt increase in responding seen at the start of the interval remained, but it was followed by a lower and fairly constant rate thereafter. For K32 and K35 the first exposure to Condition p=1.0 produced patterns similar to those seen with K26 and K28. Patterning for the other two rats was different. K34 responded at a fairly constant rate across the interval, whereas patterning for K33 looked like that normally found on a FI schedule. When all of the rats were exposed to the FI schedule one main effect emerged. That is, the major characteristics of response patterning in the previous condition for each rat were replicated. Patterning for four rats (K26, K28, K32, & K34) appeared bimodal, with a slight decrease in responding up to the first 10 s of the interval (20 s for K34) followed by acceleration thereafter. In the final condition (Condition p=1.0), the performance observed during the FI schedule for each rat was replicated, with slight increases in overall responding across the interval for K34 and K35.

Figure 5. Mean number of responses in successive seconds after reinforcement for each rat in each condition of Experiment 2. Note that because ZDs were introduced in the final 4 s of the FT component, response distributions for all conditions except Condition p=0 are shown for the first 26 s after reinforcement.

Response Rate

Overall response rates for each animal in each condition are shown in Figure 6. For K26 and K28 the first transition to Condition p=1.0 produced a three-fold increase in response rate. When K26 was returned to Condition p=0, response rate decreased markedly; this did not happen for K28. In subsequent conditions, response rates were similar in each condition for these two rats. Across the other rats different effects emerged. For K33 and K35 response rate was comparable throughout all conditions. Rate decreased across conditions for K32. Rat K34 produced a staggering rate of about 60 responses per minute in the first condition. This dropped to just under 40 responses per minute in the second condition and increased slightly again in the final condition.

Figure 6. Overall response rate for each rat in each condition in Experiment 2. Bars indicate standard deviations of the session means.

Pausing

PRPs for each animal in each condition are given in Figure 7. Rats K26 and K28 produced pauses of about s in the first condition.

Figure 7. Median postreinforcement pause duration for each rat in each condition in Experiment 2. Bars indicate standard deviations of the session means.

Thereafter there were substantial reductions in pause duration, so that by the final three conditions they were pausing for less than 5 s after each reinforcer delivery. Pausing for K32, K34, and K35 was also minimal, but this was more extreme for K34. Rat K33 consistently produced the longest pauses in all conditions, between 17 and 20 s.

Discussion

The primary objective of Experiment 2 was to compare performances produced by two schedules that each produce periodic response-reinforcer contiguity (ZDs). These schedules were the modified recycling conjunctive FT 30 s FR 1 (Keenan & Leslie, 1986) that provided maximum opportunity to obtain ZDs (Condition p=1.0) and a FI 26 s schedule of food reinforcement. Of the six rats that were used, only one rat (K33) produced results that were not complicated by an unexpected history of gnawing. The rats that gnawed all had unusually elevated levels of responding in the first 10 seconds immediately after reinforcer delivery. For K26 and K28, which were first exposed to the normal recycling conjunctive schedule, elevations in responding that occurred during this period on the modified schedule were not reversed when the normal schedule was reintroduced; such findings occurred also in Experiment 1. In addition, exposure to the FI contingencies did not produce substantial changes in patterning across animals. Rather, patterning on the FI schedule was determined by patterning on the immediately preceding schedule. Across all of these rats it might have been expected that continued exposure to periodic reinforcer presentation per se would have resulted in extended pausing after reinforcer delivery, as is normally the case (Richelle & Lejeune, 1980). Instead, the unusual patterns of responding on both the modified recycling conjunctive schedule and on the FI schedule remained resistant to change.

For K33, performance on the modified recycling conjunctive schedule and on the FI schedule was undisturbed by gnawing. FI-like patterning occurred on the modified schedule, and across conditions performances were virtually indistinguishable. This finding, in conjunction with the results from K1, K10, and K12 in Experiment 1, lends support to the view that periodic response-reinforcer contiguity on a FI schedule is a major determinant of response patterning (see also Keenan & Toal, 1991; Keenan & Watt, 1990).

The ability of the modified recycling conjunctive schedule to produce FI-like patterning holds much promise for other research. Future studies could, for example, map in more detail than presented here the similarities and differences in responding on both schedules (see Richelle & Lejeune, 1980, for extended discussions on molecular and molar levels of analysis of FI performance). This information might help to pinpoint which aspects of the dynamics on these schedules are responsible for the different performances.

This focus on dynamics would necessitate a systems-based language (Keenan & Toal, 1991) that might open the way for classifying schedules in a way that transcends the traditional methods used at present. Keenan and Watt (1990) touched on this issue when they discussed the differences between four schedules: the FI (i.e., tandem FT FR 1), the conjunctive FT FR 1, the recycling conjunctive FT FR 1, and the FT schedule. Although three of these schedules are constructed from the same elements, it is the differing dynamics inherent in their different structures that crucially determines baseline performance. This point was emphasized when a comparison was made between the FT schedule and the other schedules:

In view of the fact that a FI schedule represents but one of a number of systems comprising a response contingency and periodic food presentation, ... it is incumbent upon us to use other related systems in order to clarify the manner in which this particular combination is distinctive. (pp. )

Other studies of the modified recycling conjunctive schedule might also be able to determine whether or not there is a threshold of exposure to response-reinforcer contiguity that leads to rapid changes in patterning and rate of responding. One thing that was not clear from the results obtained here was the minimum relative frequency of contiguity needed to produce FI-like patterning. Any future studies that address this issue must deal with the relative effects of accidental and programmed contiguities without contaminating the single response contingency on the schedule.

General Discussion

Two general conclusions emerged from these experiments. Firstly, FI-like patterning can occur on a modified recycling conjunctive schedule that allows for a relatively high frequency of periodic response-reinforcer contiguity. Secondly, the pattern of responding obtained on this schedule can determine subsequent performance on a FI schedule. If we look at the first conclusion we can ask the following questions. How are such marked differences in the formal specifications of the FI and modified recycling conjunctive schedules to be reconciled with the similarity of performance they can produce? Do they produce similar performances by different means? The answer to these questions must await the results of future studies that address the nature of the processes which occur during the acquisition phase of baseline responding.

1 Keenan and Toal (1991) defined a schedule in the following way: "a schedule is more properly conceived as providing an opportunity for observing the dynamic behavioral system that 'crystallizes out' when a biological system is exposed to environmental constraints... At any one instance, the characteristics of the behavioral system are dependent upon the interplay between the 'plasticity' or dynamic limitations inherent in the adaptiveness of the biological system, and the dynamics imposed across time by the structure of the prevailing contingencies." (p. 113)

A focus on acquisition would be in keeping with the suggested need to map the behavioral dynamics on these schedules. Another argument for looking at acquisition more closely comes from the unusual patterns of responding which occurred during the interreinforcer interval for many of the animals. Processes occurring during acquisition produced these patterns. Once established, they were maintained within the context of periodic response-reinforcer contiguity during baseline conditions. A related argument is that periodic food presentation per se does not result inevitably in extended pausing and bunching of responding in the terminal segment of the interreinforcer interval, a finding that is common with extended exposure on a FI schedule (Richelle & Lejeune, 1980). Results from K1 and K2 bear this out. Although the impact of history effects on FI patterning has been reported elsewhere (e.g., Baron & Leinenweber, 1995), none of these studies provided histories using regular food presentation in conjunction with a single response contingency. It seems that the history effects reported here, although unintentional and interesting in their own right, are fortuitous insofar as they help draw attention to an important and useful distinction between the FI schedule and the modified recycling conjunctive schedule. Although these schedules are similar in the components which comprise their formal specifications, they differ markedly in the dynamics that they each support. Behavior on the modified recycling conjunctive schedule reflects the combined effects of accidental contiguities (between any behavior and reinforcer delivery) and programmed contiguities (between the selected operant and reinforcer delivery) within the context of a system that does not arrange contiguous reinforcement for extended pausing (as would be the case on a FI schedule) and which arranges reinforcement (albeit delayed) for responding during the FT component.

Although these schedules differ in their dynamics because of the way the response contingency is designed, they share the effects of an important variable, the FT component. Numerous studies have shown that on its own a FT schedule of reinforcer delivery controls distinctive temporal distributions of behavior. It has been shown repeatedly that various categories of behavior become organized into a sequential pattern during the interreinforcer interval (Anderson & Shettleworth, 1977; Staddon, 1977). Of particular interest here are those kinds of behaviors (interim behaviors) which occur immediately after food delivery. The gnawing that was observed in this study was probably an instance of this behavior. The main support for this argument comes from the temporal location of the behavior. It reliably occurred only in the first seconds of the interval (cf. Keenan & Watt, 1990). Unusually, though, it seems that this behavior became incorporated into the behavior that was selected initially as the operant. This possibility is supported by the finding of variations in the rate of lever depression across conditions during the period immediately after reinforcer delivery.

Because these animals either stood on the lever or chewed it, there is no way to distinguish operant pressing of the lever from depression of the lever that arose incidentally from other schedule-induced gnawing. Hence the unusual patterns that were recorded. Future studies could explore the possibility that interim activities can interact with an operant in the manner suggested, by first inducing interim activities on a FT schedule that has a chewable, but inoperative, operandum. The FT schedule might then be changed to a modified recycling conjunctive schedule by attaching the response contingency to the operandum. Such a study might also look at the possibility that the performances produced here are in some way related to the strain of rat used; previous studies with the recycling conjunctive schedule have used Sprague Dawleys whereas albino Wistar rats were used here.

To conclude, the mixture of patterns observed across animals in both experiments also has important implications for how we view the notion of 'temporal control.' To put it briefly, discussion of temporal control often arises in response to the baseline patterning that appears on a schedule where reinforcers are presented at regular intervals of time. Historically the patterning frequently discussed is that found on FI schedules. There are problems, however, with the way in which the notion of temporal control is sometimes used both to describe and to explain performance (Leslie, 1996). At the descriptive level, the patterning on a FI schedule is said to reflect temporal control by virtue of its synchronization with the periodicity of the reinforcer presentation. On balance, the explanation for the pattern is sometimes said to reflect an animal's 'ability to discriminate the passage of time.' Blackman (1983) discussed how the seductive nature of temporal patterning in behavior can result in the cognitive metaphor of an internal clock:

A particularly interesting paper in the field of cognitive learning theory is that of Church (1978). The paper reports the results of an extensive series of experiments which reveal how rats' lever-pressing behavior can become functionally related to the passage of time, that is how temporal patternings develop in operant behavior as a result of certain schedules of reinforcement. Behavior analysts would seek ways of capturing and describing (a) the temporal pattern of behavior, (b) the temporal patterning of environmental events, and (c) the functional relationships between these two measures. Church himself (p. 282) recognizes that his experiments 'provide ample evidence that there is a relationship between time and behavior'. However the evolution of Church's attempts to explain [emphasis in original] this relationship is illuminating, ... Church's data describing functional relationships between behavior and environmental events takes second place to a clock that runs, stops, or runs at different speeds, but which cannot be seen by the experimenter.

The clock apparently has to be consulted by a positive decision on the part of the rat, who then decides how to behave as a result of his reading. The behavior analysts' position can be succinctly conveyed by the suggestion that it would be more profitable to conceptualize the rat as [emphasis in original] the clock. Rather than reflecting the operation of [emphasis in original] a clock, the rat's behavior would now be said to have [emphasis in original] the properties of a clock in certain environmental conditions. The functioning of this clock (i.e., this behavior) may differ in different environmental conditions, and it is the task of experimental analysis to identify these conditions and their effects on the clock (behavior). (pp. )

Imagine for a moment that the findings reported here were the first of their kind. That is, imagine that the data presented here represented the first findings using schedules with FT components. I suspect that the variety of patterns shown here would preclude the use of the term temporal control as outlined by Church. Explanations that refer to the operation of an internal clock as an independent variable would be forced to invent as many different kinds of clocks, or clock rates, as there were patterns of responding. Also, because all of the 'clocks' were controlled by the same temporal distribution of reinforcers, one would have to explain why the rats 'decided to do different things at the same time in the interval.' Another problem with the notion of temporal control as an explanatory term is that it requires decisions to be made for determining the criteria that distinguish 'poor temporal control' from 'accurate temporal control.' With regard to the patterns reported here this would be extremely problematical. A natural science perspective on these response patterns, however, has no need for the term temporal control as traditionally conceived. Because they are all instances of adaptive behavior they cannot be graded as examples of either poor or accurate temporal control. They are explained only by reference to the dynamics afforded by the structure of the prevailing contingencies.

References

ANDERSON, M. C., & SHETTLEWORTH, S. J. (1977). Behavioral adaptation to fixed-interval and fixed-time food delivery in golden hamsters. Journal of the Experimental Analysis of Behavior, 27,
BARON, A., & LEINENWEBER, A. (1994). Molecular and molar analyses of fixed-interval performance. Journal of the Experimental Analysis of Behavior, 61,
BARON, A., & LEINENWEBER, A. (1995). Effects of a variable-ratio conditioning history on sensitivity to fixed-interval contingencies in rats. Journal of the Experimental Analysis of Behavior, 63,
BICKEL, W. K., HIGGINS, S. T., KIRBY, K., & JOHNSTON, L. M. (1988). An inverse relationship between baseline fixed-interval response rate and the effects of a tandem response requirement. Journal of the Experimental Analysis of Behavior, 50,

BLACKMAN, D. E. (1983). On cognitive theories of animal learning: Extrapolation from humans to animals? In G. C. L. Davey (Ed.), Animal models of human behavior. New York: John Wiley & Sons Ltd.
CATANIA, A. C. (1998). Learning (4th ed.). Upper Saddle River, NJ: Prentice Hall, Inc.
CHURCH, R. M. (1978). The internal clock. In S. H. Hulse, H. Fowler, & W. K. Honig (Eds.), Cognitive processes in animal behavior. Hillsdale, NJ: Erlbaum Associates.
CUMMING, W. W., & SCHOENFELD, W. N. (1958). Behavior under extended exposure to a high-value fixed-interval schedule. Journal of the Experimental Analysis of Behavior, 1,
DAVEY, G. (1987). Animal learning and conditioning. London: Macmillan Education.
DEWS, P. B. (1970). The theory of fixed-interval responding. In W. N. Schoenfeld (Ed.), The theory of reinforcement schedules (pp. ). New York: Appleton-Century-Crofts.
FERSTER, C. B., & SKINNER, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts.
HYTEN, C., & MADDEN, G. J. (1993). The scallop in human fixed-interval research: A review of problems with data description. The Psychological Record, 43,
JOHNSTON, L. M., BICKEL, W. K., HIGGINS, S. T., & MORRIS, E. K. (1991). The effects of schedule history and opportunity for adjunctive responding on behavior during a fixed-interval schedule of reinforcement. Journal of the Experimental Analysis of Behavior, 55,
KEENAN, M. (1982). Behavioural organisation in schedules of reinforcement. Unpublished D. Phil. thesis, University of Ulster.
KEENAN, M. (1986). Second-order schedules. The Psychological Record, 36,
KEENAN, M., & LESLIE, J. C. (1986). Varying response-reinforcer contiguity in a recycling conjunctive schedule. Journal of the Experimental Analysis of Behavior, 45,
KEENAN, M., & TOAL, L. (1991). Periodic reinforcement and second-order schedules. The Psychological Record, 41,
KEENAN, M., & WATT, A. (1990). Concurrent behavior and response-reinforcer contiguity. The Psychological Record, 40,
LEINENWEBER, A., NIETZEL, S. M., & BARON, A. (1996). Temporal control by progressive-interval schedules of reinforcement. Journal of the Experimental Analysis of Behavior, 66,
LESLIE, J. C. (1996). Principles of behavior analysis. Amsterdam: Harwood Academic Publishers GmbH.
LOWE, C. F., & WEARDEN, J. H. (1981). Weber's law and the fixed-interval postreinforcement pause. Behavior Analysis Letters, 1,
MORSE, W. H., & KELLEHER, R. T. (1977). Determinants of reinforcement and punishment. In W. K. Honig & J. E. R. Staddon (Eds.), Handbook of operant behavior (pp. ). Englewood Cliffs, NJ: Prentice-Hall.
RICHELLE, M., & LEJEUNE, H. (1980). Time in animal behaviour. Oxford: Pergamon.
STADDON, J. E. R. (1977). Schedule-induced behavior. In W. K. Honig & J. E. R. Staddon (Eds.), Handbook of operant behavior (pp. ). Englewood Cliffs, NJ: Prentice-Hall.
STADDON, J. E. R. (1983). Adaptive behavior and learning. New York: Cambridge University Press.


More information

RESPONSE PERSISTENCE UNDER RATIO AND INTERVAL REINFORCEMENT SCHEDULES KENNON A. LATTAL, MARK P. REILLY, AND JAMES P. KOHN

RESPONSE PERSISTENCE UNDER RATIO AND INTERVAL REINFORCEMENT SCHEDULES KENNON A. LATTAL, MARK P. REILLY, AND JAMES P. KOHN JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1998, 70, 165 183 NUMBER 2(SEPTEMBER) RESPONSE PERSISTENCE UNDER RATIO AND INTERVAL REINFORCEMENT SCHEDULES KENNON A. LATTAL, MARK P. REILLY, AND JAMES

More information

SECOND-ORDER SCHEDULES: BRIEF SHOCK AT THE COMPLETION OF EACH COMPONENT'

SECOND-ORDER SCHEDULES: BRIEF SHOCK AT THE COMPLETION OF EACH COMPONENT' JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR SECOND-ORDER SCHEDULES: BRIEF SHOCK AT THE COMPLETION OF EACH COMPONENT' D. ALAN STUBBS AND PHILIP J. SILVERMAN UNIVERSITY OF MAINE, ORONO AND WORCESTER

More information

postreinforcement pause for a minute or two at the beginning of the session. No reduction

postreinforcement pause for a minute or two at the beginning of the session. No reduction PUNISHMENT A ND RECO VER Y D URING FIXED-RA TIO PERFORMA NCE' NATHAN H. AZRIN2 ANNA STATE HOSPITAL When a reinforcement is delivered according to a fixed-ratio schedule, it has been found that responding

More information

Phil Reed. Learn Behav (2011) 39:27 35 DOI /s Published online: 24 September 2010 # Psychonomic Society 2010

Phil Reed. Learn Behav (2011) 39:27 35 DOI /s Published online: 24 September 2010 # Psychonomic Society 2010 Learn Behav (211) 39:27 35 DOI 1.17/s1342-1-3-5 Effects of response-independent stimuli on fixed-interval and fixed-ratio performance of rats: a model for stressful disruption of cyclical eating patterns

More information

REPEATED MEASUREMENTS OF REINFORCEMENT SCHEDULE EFFECTS ON GRADIENTS OF STIMULUS CONTROL' MICHAEL D. ZEILER

REPEATED MEASUREMENTS OF REINFORCEMENT SCHEDULE EFFECTS ON GRADIENTS OF STIMULUS CONTROL' MICHAEL D. ZEILER JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR REPEATED MEASUREMENTS OF REINFORCEMENT SCHEDULE EFFECTS ON GRADIENTS OF STIMULUS CONTROL' MICHAEL D. ZEILER UNIVERSITY OF IOWA 1969, 12, 451-461 NUMBER

More information

STIMULUS FUNCTIONS IN TOKEN-REINFORCEMENT SCHEDULES CHRISTOPHER E. BULLOCK

STIMULUS FUNCTIONS IN TOKEN-REINFORCEMENT SCHEDULES CHRISTOPHER E. BULLOCK STIMULUS FUNCTIONS IN TOKEN-REINFORCEMENT SCHEDULES By CHRISTOPHER E. BULLOCK A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR

More information

CAROL 0. ECKERMAN UNIVERSITY OF NORTH CAROLINA. in which stimulus control developed was studied; of subjects differing in the probability value

CAROL 0. ECKERMAN UNIVERSITY OF NORTH CAROLINA. in which stimulus control developed was studied; of subjects differing in the probability value JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1969, 12, 551-559 NUMBER 4 (JULY) PROBABILITY OF REINFORCEMENT AND THE DEVELOPMENT OF STIMULUS CONTROL' CAROL 0. ECKERMAN UNIVERSITY OF NORTH CAROLINA Pigeons

More information

Schedules of Reinforcement

Schedules of Reinforcement Schedules of Reinforcement MACE, PRATT, ZANGRILLO & STEEGE (2011) FISHER, PIAZZA & ROANE CH 4 Rules that describe how will be reinforced are 1. Every response gets SR+ ( ) vs where each response gets 0

More information

THE EFFECTS OF TERMINAL-LINK STIMULUS ARRANGEMENTS ON PREFERENCE IN CONCURRENT CHAINS. LAUREL COLTON and JAY MOORE University of Wisconsin-Milwaukee

THE EFFECTS OF TERMINAL-LINK STIMULUS ARRANGEMENTS ON PREFERENCE IN CONCURRENT CHAINS. LAUREL COLTON and JAY MOORE University of Wisconsin-Milwaukee The Psychological Record, 1997,47,145-166 THE EFFECTS OF TERMINAL-LINK STIMULUS ARRANGEMENTS ON PREFERENCE IN CONCURRENT CHAINS LAUREL COLTON and JAY MOORE University of Wisconsin-Milwaukee Pigeons served

More information

CS DURATION' UNIVERSITY OF CHICAGO. in response suppression (Meltzer and Brahlek, with bananas. MH to S. P. Grossman. The authors wish to

CS DURATION' UNIVERSITY OF CHICAGO. in response suppression (Meltzer and Brahlek, with bananas. MH to S. P. Grossman. The authors wish to JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1971, 15, 243-247 NUMBER 2 (MARCH) POSITIVE CONDITIONED SUPPRESSION: EFFECTS OF CS DURATION' KLAUS A. MICZEK AND SEBASTIAN P. GROSSMAN UNIVERSITY OF CHICAGO

More information

KEY PECKING IN PIGEONS PRODUCED BY PAIRING KEYLIGHT WITH INACCESSIBLE GRAIN'

KEY PECKING IN PIGEONS PRODUCED BY PAIRING KEYLIGHT WITH INACCESSIBLE GRAIN' JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1975, 23, 199-206 NUMBER 2 (march) KEY PECKING IN PIGEONS PRODUCED BY PAIRING KEYLIGHT WITH INACCESSIBLE GRAIN' THOMAS R. ZENTALL AND DAVID E. HOGAN UNIVERSITY

More information

STUDIES OF WHEEL-RUNNING REINFORCEMENT: PARAMETERS OF HERRNSTEIN S (1970) RESPONSE-STRENGTH EQUATION VARY WITH SCHEDULE ORDER TERRY W.

STUDIES OF WHEEL-RUNNING REINFORCEMENT: PARAMETERS OF HERRNSTEIN S (1970) RESPONSE-STRENGTH EQUATION VARY WITH SCHEDULE ORDER TERRY W. JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2000, 73, 319 331 NUMBER 3(MAY) STUDIES OF WHEEL-RUNNING REINFORCEMENT: PARAMETERS OF HERRNSTEIN S (1970) RESPONSE-STRENGTH EQUATION VARY WITH SCHEDULE

More information

an ability that has been acquired by training (process) acquisition aversive conditioning behavior modification biological preparedness

an ability that has been acquired by training (process) acquisition aversive conditioning behavior modification biological preparedness acquisition an ability that has been acquired by training (process) aversive conditioning A type of counterconditioning that associates an unpleasant state (such as nausea) with an unwanted behavior (such

More information

Behavioural Processes

Behavioural Processes Behavioural Processes 89 (2012) 212 218 Contents lists available at SciVerse ScienceDirect Behavioural Processes j o ur nal homep age : www.elsevier.com/locate/behavproc Providing a reinforcement history

More information

1. A type of learning in which behavior is strengthened if followed by a reinforcer or diminished if followed by a punisher.

1. A type of learning in which behavior is strengthened if followed by a reinforcer or diminished if followed by a punisher. 1. A stimulus change that increases the future frequency of behavior that immediately precedes it. 2. In operant conditioning, a reinforcement schedule that reinforces a response only after a specified

More information

UNIVERSITY OF WALES SWANSEA AND WEST VIRGINIA UNIVERSITY

UNIVERSITY OF WALES SWANSEA AND WEST VIRGINIA UNIVERSITY JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 05, 3, 3 45 NUMBER (JANUARY) WITHIN-SUBJECT TESTING OF THE SIGNALED-REINFORCEMENT EFFECT ON OPERANT RESPONDING AS MEASURED BY RESPONSE RATE AND RESISTANCE

More information

Travel Distance and Stimulus Duration on Observing Responses by Rats

Travel Distance and Stimulus Duration on Observing Responses by Rats EUROPEAN JOURNAL OF BEHAVIOR ANALYSIS 2010, 11, 79-91 NUMBER 1 (SUMMER 2010) 79 Travel Distance and Stimulus Duration on Observing Responses by Rats Rogelio Escobar National Autonomous University of Mexico

More information

CONTINGENT MAGNITUDE OF REWARD IN A HUMAN OPERANT IRT>15-S-LH SCHEDULE. LOUIS G. LIPPMAN and LYLE E. LERITZ Western Washington University

CONTINGENT MAGNITUDE OF REWARD IN A HUMAN OPERANT IRT>15-S-LH SCHEDULE. LOUIS G. LIPPMAN and LYLE E. LERITZ Western Washington University The Psychological Record, 2002, 52, 89-98 CONTINGENT MAGNITUDE OF REWARD IN A HUMAN OPERANT IRT>15-S-LH SCHEDULE LOUIS G. LIPPMAN and LYLE E. LERITZ Western Washington University In an IRT>15-s schedule,

More information

Operant Conditioning B.F. SKINNER

Operant Conditioning B.F. SKINNER Operant Conditioning B.F. SKINNER Reinforcement in Operant Conditioning Behavior Consequence Patronize Elmo s Diner It s all a matter of consequences. Rewarding Stimulus Presented Tendency to tell jokes

More information

The digital copy of this thesis is protected by the Copyright Act 1994 (New Zealand).

The digital copy of this thesis is protected by the Copyright Act 1994 (New Zealand). http://researchcommons.waikato.ac.nz/ Research Commons at the University of Waikato Copyright Statement: The digital copy of this thesis is protected by the Copyright Act 1994 (New Zealand). The thesis

More information

CONTINGENCY RELATIONS: THE ROLE OF

CONTINGENCY RELATIONS: THE ROLE OF JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR COLLEGE STUDENTS' RESPONDING TO AND RATING OF CONTINGENCY RELATIONS: THE ROLE OF TEMPORAL CONTIGUITY EDWARD A. WASSERMAN AND DANNY J. NEUNABER THE UNIVERSITY

More information

Schedule Induced Polydipsia: Effects of Inter-Food Interval on Access to Water as a Reinforcer

Schedule Induced Polydipsia: Effects of Inter-Food Interval on Access to Water as a Reinforcer Western Michigan University ScholarWorks at WMU Master's Theses Graduate College 8-1974 Schedule Induced Polydipsia: Effects of Inter-Food Interval on Access to Water as a Reinforcer Richard H. Weiss Western

More information

Examining the Constant Difference Effect in a Concurrent Chains Procedure

Examining the Constant Difference Effect in a Concurrent Chains Procedure University of Wisconsin Milwaukee UWM Digital Commons Theses and Dissertations May 2015 Examining the Constant Difference Effect in a Concurrent Chains Procedure Carrie Suzanne Prentice University of Wisconsin-Milwaukee

More information

EFFECTS OF INTERRESPONSE-TIME SHAPING ON MULTIPLE SCHEDULE PERFORMANCE. RAFAEL BEJARANO University of Kansas

EFFECTS OF INTERRESPONSE-TIME SHAPING ON MULTIPLE SCHEDULE PERFORMANCE. RAFAEL BEJARANO University of Kansas The Psychological Record, 2004, 54, 479-490 EFFECTS OF INTERRESPONSE-TIME SHAPING ON MULTIPLE SCHEDULE PERFORMANCE RAFAEL BEJARANO University of Kansas The experiment reported herein was conducted to determine

More information

Within-event learning contributes to value transfer in simultaneous instrumental discriminations by pigeons

Within-event learning contributes to value transfer in simultaneous instrumental discriminations by pigeons Animal Learning & Behavior 1999, 27 (2), 206-210 Within-event learning contributes to value transfer in simultaneous instrumental discriminations by pigeons BRIGETTE R. DORRANCE and THOMAS R. ZENTALL University

More information

Operant response topographies of rats receiving food or water reinforcers on FR or FI reinforcement schedules

Operant response topographies of rats receiving food or water reinforcers on FR or FI reinforcement schedules Animal Learning& Behavior 1981,9 (3),406-410 Operant response topographies of rats receiving food or water reinforcers on FR or FI reinforcement schedules JOHN H. HULL, TIMOTHY J. BARTLETT, and ROBERT

More information

Signaled reinforcement effects on fixed-interval performance of rats with lever depressing or releasing as a target response 1

Signaled reinforcement effects on fixed-interval performance of rats with lever depressing or releasing as a target response 1 Japanese Psychological Research 1998, Volume 40, No. 2, 104 110 Short Report Signaled reinforcement effects on fixed-interval performance of rats with lever depressing or releasing as a target response

More information

EFFECTS OF A LIMITED HOLD ON PIGEONS MATCH-TO-SAMPLE PERFORMANCE UNDER FIXED-RATIO SCHEDULING. Joseph Leland Cermak, B.A.

EFFECTS OF A LIMITED HOLD ON PIGEONS MATCH-TO-SAMPLE PERFORMANCE UNDER FIXED-RATIO SCHEDULING. Joseph Leland Cermak, B.A. EFFECTS OF A LIMITED HOLD ON PIGEONS MATCH-TO-SAMPLE PERFORMANCE UNDER FIXED-RATIO SCHEDULING Joseph Leland Cermak, B.A. Thesis Prepared for the Degree of MASTER OF SCIENCE UNIVERSITY OF NORTH TEXAS December

More information

acquisition associative learning behaviorism A type of learning in which one learns to link two or more stimuli and anticipate events

acquisition associative learning behaviorism A type of learning in which one learns to link two or more stimuli and anticipate events acquisition associative learning In classical conditioning, the initial stage, when one links a neutral stimulus and an unconditioned stimulus so that the neutral stimulus begins triggering the conditioned

More information

Learning. Learning is a relatively permanent change in behavior acquired through experience or practice.

Learning. Learning is a relatively permanent change in behavior acquired through experience or practice. Learning Learning is a relatively permanent change in behavior acquired through experience or practice. What is Learning? Learning is the process that allows us to adapt (be flexible) to the changing conditions

More information

Extinction of the Context and Latent Inhibition

Extinction of the Context and Latent Inhibition LEARNING AND MOTIVATION 13, 391-416 (1982) Extinction of the Context and Latent Inhibition A. G. BAKER AND PIERRE MERCIER McGill University The hypothesis that latent inhibition could be reduced by extinguishing

More information

CRF or an Fl 5 min schedule. They found no. of S presentation. Although more responses. might occur under an Fl 5 min than under a

CRF or an Fl 5 min schedule. They found no. of S presentation. Although more responses. might occur under an Fl 5 min than under a JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR VOLUME 5, NUMBF- 4 OCITOBER, 1 962 THE EFECT OF TWO SCHEDULES OF PRIMARY AND CONDITIONED REINFORCEMENT JOAN G. STEVENSON1 AND T. W. REESE MOUNT HOLYOKE

More information

Functionality. A Case For Teaching Functional Skills 4/8/17. Teaching skills that make sense

Functionality. A Case For Teaching Functional Skills 4/8/17. Teaching skills that make sense Functionality Teaching skills that make sense Mary Jane Weiss, Ph.D., BCBA-D Eden Princeton Lecture Series April, 2017 A Case For Teaching Functional Skills Preston Lewis, Dec. 1987, TASH Newsletter excerpt

More information

Human Schedule Performance with Hypothetical Monetary Reinforcement

Human Schedule Performance with Hypothetical Monetary Reinforcement EUROPEAN JOURNAL OF BEHAVIOR ANALYSIS 2001, 2, 225-234 NUMBER 2 (WINTER 2001)225 Human Schedule Performance with Hypothetical Monetary Reinforcement University College London, UK Experiments examined the

More information

Measures of temporal discrimination in fixedinterval performance: A case study in archiving data

Measures of temporal discrimination in fixedinterval performance: A case study in archiving data Behavior Research Methods, Instruments, & Computers 2004, 36 (4), 661 669 Measures of temporal discrimination in fixedinterval performance: A case study in archiving data PAULO GUILHARDI and RUSSELL M.

More information

Chapter 6/9: Learning

Chapter 6/9: Learning Chapter 6/9: Learning Learning A relatively durable change in behavior or knowledge that is due to experience. The acquisition of knowledge, skills, and behavior through reinforcement, modeling and natural

More information

Behavioral Contrast: A New Solution to an Old Problem

Behavioral Contrast: A New Solution to an Old Problem Illinois Wesleyan University Digital Commons @ IWU Honors Projects Psychology 2000 Behavioral Contrast: A New Solution to an Old Problem Sara J. Estle '00 Illinois Wesleyan University Recommended Citation

More information

OBSERVING RESPONSES AND SERIAL STIMULI: SEARCHING FOR THE REINFORCING PROPERTIES OF THE S2 ROGELIO ESCOBAR AND CARLOS A. BRUNER

OBSERVING RESPONSES AND SERIAL STIMULI: SEARCHING FOR THE REINFORCING PROPERTIES OF THE S2 ROGELIO ESCOBAR AND CARLOS A. BRUNER JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2009, 92, 215 231 NUMBER 2(SEPTEMBER) OBSERVING RESPONSES AND SERIAL STIMULI: SEARCHING FOR THE REINFORCING PROPERTIES OF THE S2 ROGELIO ESCOBAR AND CARLOS

More information

on both components of conc Fl Fl schedules, c and a were again less than 1.0. FI schedule when these were arranged concurrently.

on both components of conc Fl Fl schedules, c and a were again less than 1.0. FI schedule when these were arranged concurrently. JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1975, 24, 191-197 NUMBER 2 (SEPTEMBER) PERFORMANCE IN CONCURRENT INTERVAL SCHEDULES: A SYSTEMATIC REPLICATION' BRENDA LOBB AND M. C. DAVISON UNIVERSITY

More information

The Application of the Species Specific Defense Reaction Hypothesis to Free Operant Avoidance

The Application of the Species Specific Defense Reaction Hypothesis to Free Operant Avoidance Western Michigan University ScholarWorks at WMU Master's Theses Graduate College 8-1972 The Application of the Species Specific Defense Reaction Hypothesis to Free Operant Avoidance Deborah Ann Cory Western

More information

The effect of sample duration and cue on a double temporal discrimination q

The effect of sample duration and cue on a double temporal discrimination q Available online at www.sciencedirect.com Learning and Motivation 39 (2008) 71 94 www.elsevier.com/locate/l&m The effect of sample duration and cue on a double temporal discrimination q Luís Oliveira,

More information

Chapter 5: Learning and Behavior Learning How Learning is Studied Ivan Pavlov Edward Thorndike eliciting stimulus emitted

Chapter 5: Learning and Behavior Learning How Learning is Studied Ivan Pavlov Edward Thorndike eliciting stimulus emitted Chapter 5: Learning and Behavior A. Learning-long lasting changes in the environmental guidance of behavior as a result of experience B. Learning emphasizes the fact that individual environments also play

More information

RESURGENCE OF INTEGRATED BEHAVIORAL UNITS

RESURGENCE OF INTEGRATED BEHAVIORAL UNITS JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2007, 87, 5 24 NUMBER 1(JANUARY) RESURGENCE OF INTEGRATED BEHAVIORAL UNITS GUSTAVO BACHÁ-MÉNDEZ 1,ALLISTON K. REID 2, AND ADELA MENDOZA-SOYLOVNA 1 1 FACULTAD

More information

Occasion Setting without Feature-Positive Discrimination Training

Occasion Setting without Feature-Positive Discrimination Training LEARNING AND MOTIVATION 23, 343-367 (1992) Occasion Setting without Feature-Positive Discrimination Training CHARLOTTE BONARDI University of York, York, United Kingdom In four experiments rats received

More information

CONTROL OF IMPULSIVE CHOICE THROUGH BIASING INSTRUCTIONS. DOUGLAS J. NAVARICK California State University, Fullerton

CONTROL OF IMPULSIVE CHOICE THROUGH BIASING INSTRUCTIONS. DOUGLAS J. NAVARICK California State University, Fullerton The Psychological Record, 2001, 51, 549-560 CONTROL OF IMPULSIVE CHOICE THROUGH BIASING INSTRUCTIONS DOUGLAS J. NAVARICK California State University, Fullerton College students repeatedly chose between

More information

RECALL OF PAIRED-ASSOCIATES AS A FUNCTION OF OVERT AND COVERT REHEARSAL PROCEDURES TECHNICAL REPORT NO. 114 PSYCHOLOGY SERIES

RECALL OF PAIRED-ASSOCIATES AS A FUNCTION OF OVERT AND COVERT REHEARSAL PROCEDURES TECHNICAL REPORT NO. 114 PSYCHOLOGY SERIES RECALL OF PAIRED-ASSOCIATES AS A FUNCTION OF OVERT AND COVERT REHEARSAL PROCEDURES by John W. Brelsford, Jr. and Richard C. Atkinson TECHNICAL REPORT NO. 114 July 21, 1967 PSYCHOLOGY SERIES!, Reproduction

More information

PSYC2010: Brain and Behaviour

PSYC2010: Brain and Behaviour PSYC2010: Brain and Behaviour PSYC2010 Notes Textbook used Week 1-3: Bouton, M.E. (2016). Learning and Behavior: A Contemporary Synthesis. 2nd Ed. Sinauer Week 4-6: Rieger, E. (Ed.) (2014) Abnormal Psychology:

More information

acquisition associative learning behaviorism B. F. Skinner biofeedback

acquisition associative learning behaviorism B. F. Skinner biofeedback acquisition associative learning in classical conditioning the initial stage when one links a neutral stimulus and an unconditioned stimulus so that the neutral stimulus begins triggering the conditioned

More information

Delayed Matching-To-Sample Test in Macaques

Delayed Matching-To-Sample Test in Macaques C O N F I D E N T I A L Delayed Matching-To-Sample Test in Macaques DATE This study was conducted under the terms of a Materials Transfer and Services Agreement between NeuroDetective International and

More information

Acquisition of bar-pressing under interval and ratio-schedules in the presence and absence of stimuli correlated with water delivery

Acquisition of bar-pressing under interval and ratio-schedules in the presence and absence of stimuli correlated with water delivery EUROPEAN JOURNAL OF BEHAVIOR ANALYSIS 2009, 10, 19-29 NUMBER 1 (SUMMER 2009) 19 Acquisition of bar-pressing under interval and ratio-schedules in the presence and absence of stimuli correlated with water

More information

Learning. Learning is a relatively permanent change in behavior acquired through experience.

Learning. Learning is a relatively permanent change in behavior acquired through experience. Learning Learning is a relatively permanent change in behavior acquired through experience. Classical Conditioning Learning through Association Ivan Pavlov discovered the form of learning called Classical

More information

VERNON L. QUINSEY DALHOUSIE UNIVERSITY. in the two conditions. If this were possible, well understood where the criterion response is

VERNON L. QUINSEY DALHOUSIE UNIVERSITY. in the two conditions. If this were possible, well understood where the criterion response is JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR LICK-SHOCK CONTINGENCIES IN THE RATT1 VERNON L. QUINSEY DALHOUSIE UNIVERSITY 1972, 17, 119-125 NUMBER I (JANUARY) Hungry rats were allowed to lick an 8%

More information

Cronfa - Swansea University Open Access Repository

Cronfa - Swansea University Open Access Repository Cronfa - Swansea University Open Access Repository This is an author produced version of a paper published in : Learning & Behavior Cronfa URL for this paper: http://cronfa.swan.ac.uk/record/cronfa31054

More information

Classical Conditioning Classical Conditioning - a type of learning in which one learns to link two stimuli and anticipate events.

Classical Conditioning Classical Conditioning - a type of learning in which one learns to link two stimuli and anticipate events. Classical Conditioning Classical Conditioning - a type of learning in which one learns to link two stimuli and anticipate events. behaviorism - the view that psychology (1) should be an objective science

More information

Association. Operant Conditioning. Classical or Pavlovian Conditioning. Learning to associate two events. We learn to. associate two stimuli

Association. Operant Conditioning. Classical or Pavlovian Conditioning. Learning to associate two events. We learn to. associate two stimuli Myers PSYCHOLOGY (7th Ed) Chapter 8 Learning James A. McCubbin, PhD Clemson University Worth Publishers Learning Learning relatively permanent change in an organism s behavior due to experience Association

More information

Chapter 5: How Do We Learn?

Chapter 5: How Do We Learn? Chapter 5: How Do We Learn? Defining Learning A relatively permanent change in behavior or the potential for behavior that results from experience Results from many life experiences, not just structured

More information

Value transfer in a simultaneous discrimination by pigeons: The value of the S + is not specific to the simultaneous discrimination context

Value transfer in a simultaneous discrimination by pigeons: The value of the S + is not specific to the simultaneous discrimination context Animal Learning & Behavior 1998, 26 (3), 257 263 Value transfer in a simultaneous discrimination by pigeons: The value of the S + is not specific to the simultaneous discrimination context BRIGETTE R.

More information

The influence ofbrief stimuli uncorrelated with reinforcement on choice between variable-ratio schedules

The influence ofbrief stimuli uncorrelated with reinforcement on choice between variable-ratio schedules Animal Learning & Behavior /993. 2/ (2). /59-/67 The influence ofbrief stimuli uncorrelated with reinforcement on choice between variable-ratio schedules PHIL REED and VAL SZZUDLO University ollege London

More information

Learning. Association. Association. Unit 6: Learning. Learning. Classical or Pavlovian Conditioning. Different Types of Learning

Learning. Association. Association. Unit 6: Learning. Learning. Classical or Pavlovian Conditioning. Different Types of Learning Unit 6: Learning Learning Learning relatively permanent change in an organism s behavior due to experience experience (nurture) is the key to learning Different Types of Learning Classical -learn by association

More information

between successive DMTS choice phases.

between successive DMTS choice phases. JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1996, 66, 231 242 NUMBER 2(SEPTEMBER) SEPARATING THE EFFECTS OF TRIAL-SPECIFIC AND AVERAGE SAMPLE-STIMULUS DURATION IN DELAYED MATCHING TO SAMPLE IN PIGEONS

More information

Some Parameters of the Second-Order Conditioning of Fear in Rats

Some Parameters of the Second-Order Conditioning of Fear in Rats University of Nebraska - Lincoln DigitalCommons@University of Nebraska - Lincoln Papers in Behavior and Biological Sciences Papers in the Biological Sciences 1969 Some Parameters of the Second-Order Conditioning

More information

What is Learned? Lecture 9

What is Learned? Lecture 9 What is Learned? Lecture 9 1 Classical and Instrumental Conditioning Compared Classical Reinforcement Not Contingent on Behavior Behavior Elicited by US Involuntary Response (Reflex) Few Conditionable

More information

REINFORCEMENT OF PROBE RESPONSES AND ACQUISITION OF STIMULUS CONTROL IN FADING PROCEDURES

REINFORCEMENT OF PROBE RESPONSES AND ACQUISITION OF STIMULUS CONTROL IN FADING PROCEDURES JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1985, 439 235-241 NUMBER 2 (MARCH) REINFORCEMENT OF PROBE RESPONSES AND ACQUISITION OF STIMULUS CONTROL IN FADING PROCEDURES LANNY FIELDS THE COLLEGE OF

More information

PIGEONS CHOICES BETWEEN FIXED-RATIO AND LINEAR OR GEOMETRIC ESCALATING SCHEDULES PAUL NEUMAN, WILLIAM H. AHEARN, AND PHILIP N.

PIGEONS CHOICES BETWEEN FIXED-RATIO AND LINEAR OR GEOMETRIC ESCALATING SCHEDULES PAUL NEUMAN, WILLIAM H. AHEARN, AND PHILIP N. JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 2000, 73, 93 102 NUMBER 1(JANUARY) PIGEONS CHOICES BETWEEN FIXED-RATIO AND LINEAR OR GEOMETRIC ESCALATING SCHEDULES PAUL NEUMAN, WILLIAM H. AHEARN, AND

More information

INDUCED POLYDIPSIA1 JOHN L. FALK UNIVERSITY OF MICHIGAN. a large number of these small meals. Under. CRF, inter-reinforcement time is short and

INDUCED POLYDIPSIA1 JOHN L. FALK UNIVERSITY OF MICHIGAN. a large number of these small meals. Under. CRF, inter-reinforcement time is short and JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR VOLUME 9, NUMBER I JANUARY, 1966 THE MOTIVATIONAL PROPERTIES OF SCHEDULE- INDUCED POLYDIPSIA1 JOHN L. FALK UNIVERSITY OF MICHIGAN Schedule-induced polydipsia

More information

Observing behavior: Redundant stimuli and time since information

Observing behavior: Redundant stimuli and time since information Animal Learning & Behavior 1978,6 (4),380-384 Copyright 1978 by The Psychonornic Society, nc. Observing behavior: Redundant stimuli and time since information BRUCE A. WALD Utah State University, Logan,

More information

Jennifer J. McComas and Ellie C. Hartman. Angel Jimenez

Jennifer J. McComas and Ellie C. Hartman. Angel Jimenez The Psychological Record, 28, 58, 57 528 Some Effects of Magnitude of Reinforcement on Persistence of Responding Jennifer J. McComas and Ellie C. Hartman The University of Minnesota Angel Jimenez The University

More information

The Post-Reinforcement Pause and Terminal Rate In Fixed-Interval Schedules

The Post-Reinforcement Pause and Terminal Rate In Fixed-Interval Schedules Utah State University DigitalCommons@USU All Graduate Theses and Dissertations Graduate Studies 5-1971 The Post-Reinforcement Pause and Terminal Rate n Fixed-nterval Schedules Charles A. Lund Utah State

More information

Comparison of Direct and Indirect Reinforcement Contingencies on Task Acquisition. A Thesis Presented. Robert Mark Grant

Comparison of Direct and Indirect Reinforcement Contingencies on Task Acquisition. A Thesis Presented. Robert Mark Grant Comparison of Direct and Indirect Reinforcement Contingencies on Task Acquisition A Thesis Presented By Robert Mark Grant In partial fulfillment of the requirements for the degree of Master of Science

More information

Unit 6 Learning.

Unit 6 Learning. Unit 6 Learning https://www.apstudynotes.org/psychology/outlines/chapter-6-learning/ 1. Overview 1. Learning 1. A long lasting change in behavior resulting from experience 2. Classical Conditioning 1.

More information

Teaching Classical and Operant Conditioning in a Laboratory-Based Course: Eight Effective Experiments

Teaching Classical and Operant Conditioning in a Laboratory-Based Course: Eight Effective Experiments Teaching Classical and Operant Conditioning in a Laboratory-Based Course: Eight Effective Experiments Eric S. Murphy & Robert J. Madigan University of Alaska Anchorage We teach a junior-level experimental

More information

CURRENT RESEARCH IN SOCIAL PSYCHOLOGY

CURRENT RESEARCH IN SOCIAL PSYCHOLOGY CURRENT RESEARCH IN SOCIAL PSYCHOLOGY Volume 6, Number 2 Submitted: September 8, 2000 Resubmitted: January 22, 2001 Accepted: January 23, 2001 Publication date: January 26, 2001 THE EFFICACY OF REINFORCEMENT

More information

PREFERENCE REVERSALS WITH FOOD AND WATER REINFORCERS IN RATS LEONARD GREEN AND SARA J. ESTLE V /V (A /A )(D /D ), (1)

PREFERENCE REVERSALS WITH FOOD AND WATER REINFORCERS IN RATS LEONARD GREEN AND SARA J. ESTLE V /V (A /A )(D /D ), (1) JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 23, 79, 233 242 NUMBER 2(MARCH) PREFERENCE REVERSALS WITH FOOD AND WATER REINFORCERS IN RATS LEONARD GREEN AND SARA J. ESTLE WASHINGTON UNIVERSITY Rats

More information

Learning Habituation Associative learning Classical conditioning Operant conditioning Observational learning. Classical Conditioning Introduction

Learning Habituation Associative learning Classical conditioning Operant conditioning Observational learning. Classical Conditioning Introduction 1 2 3 4 5 Myers Psychology for AP* Unit 6: Learning Unit Overview How Do We Learn? Classical Conditioning Operant Conditioning Learning by Observation How Do We Learn? Introduction Learning Habituation

More information

Myers PSYCHOLOGY. (7th Ed) Chapter 8. Learning. James A. McCubbin, PhD Clemson University. Worth Publishers

Myers PSYCHOLOGY. (7th Ed) Chapter 8. Learning. James A. McCubbin, PhD Clemson University. Worth Publishers Myers PSYCHOLOGY (7th Ed) Chapter 8 Learning James A. McCubbin, PhD Clemson University Worth Publishers Learning Learning relatively permanent change in an organism s behavior due to experience Association

More information

Unit 06 - Overview. Click on the any of the above hyperlinks to go to that section in the presentation.

Unit 06 - Overview. Click on the any of the above hyperlinks to go to that section in the presentation. Unit 06 - Overview How We Learn and Classical Conditioning Operant Conditioning Operant Conditioning s Applications, and Comparison to Classical Conditioning Biology, Cognition, and Learning Learning By

More information

PSYC 337 LEARNING. Session 6 Instrumental and Operant Conditioning Part Two

PSYC 337 LEARNING. Session 6 Instrumental and Operant Conditioning Part Two PSYC 337 LEARNING Session 6 Instrumental and Operant Conditioning Part Two Lecturer: Dr. Inusah Abdul-Nasiru Contact Information: iabdul-nasiru@ug.edu.gh College of Education School of Continuing and Distance

More information

Determining the Reinforcing Value of Social Consequences and Establishing. Social Consequences as Reinforcers. A Thesis Presented. Hilary A.

Determining the Reinforcing Value of Social Consequences and Establishing. Social Consequences as Reinforcers. A Thesis Presented. Hilary A. Determining the Reinforcing Value of Social Consequences and Establishing Social Consequences as Reinforcers A Thesis Presented by Hilary A. Gibson The Department of Counseling and Applied Educational

More information

NIH Public Access Author Manuscript J Exp Psychol Anim Behav Process. Author manuscript; available in PMC 2005 November 14.

NIH Public Access Author Manuscript J Exp Psychol Anim Behav Process. Author manuscript; available in PMC 2005 November 14. NIH Public Access Author Manuscript Published in final edited form as: J Exp Psychol Anim Behav Process. 2005 April ; 31(2): 213 225. Timing in Choice Experiments Jeremie Jozefowiez and Daniel T. Cerutti

More information

Supporting Online Material for

Supporting Online Material for www.sciencemag.org/cgi/content/full/319/5871/1849/dc1 Supporting Online Material for Rule Learning by Rats Robin A. Murphy,* Esther Mondragón, Victoria A. Murphy This PDF file includes: *To whom correspondence

More information

RESPONSE-INDEPENDENT CONDITIONED REINFORCEMENT IN AN OBSERVING PROCEDURE

RESPONSE-INDEPENDENT CONDITIONED REINFORCEMENT IN AN OBSERVING PROCEDURE RESPONSE-INDEPENDENT CONDITIONED REINFORCEMENT IN AN OBSERVING PROCEDURE By ANTHONY L. DEFULIO A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE

More information

CORRELATED DELAY OF REINFORCEMENT 1

CORRELATED DELAY OF REINFORCEMENT 1 Journal of Comparative and Physiological Psychology 1961, Vol. 54, No. 2, 196-203 CRRELATED DELAY F REINFRCEMENT 1 It is well established that variations in an operant response can be differentiated by

More information

... CR Response ... UR NR

... CR Response ... UR NR Learning is the (1) brain-based phenomenon that is a (2) relatively permanent change (3) in behavior that results from (4) experience, (5) reinforcement, or (6) observation. (1) brain-based (2) relatively

More information

LEARNING-SET OUTCOME IN SECOND-ORDER CONDITIONAL DISCRIMINATIONS

LEARNING-SET OUTCOME IN SECOND-ORDER CONDITIONAL DISCRIMINATIONS The Psychological Record, 2000, 50, 429-442 LEARNING-SET OUTCOME IN SECOND-ORDER CONDITIONAL DISCRIMINATIONS LUIS A. PEREZ-GONZALEZ, JOSEPH E. SPRADLIN, and KATHRYN J. SAUNDERS University of Oviedo, Spain

More information

SERIAL CONDITIONING AS A FUNCTION OF STIMULUS, RESPONSE, AND TEMPORAL DEPENDENCIES

SERIAL CONDITIONING AS A FUNCTION OF STIMULUS, RESPONSE, AND TEMPORAL DEPENDENCIES University of Nebraska - Lincoln DigitalCommons@University of Nebraska - Lincoln Faculty Publications, Department of Psychology Psychology, Department of January 199 SERIAL CONDITIONING AS A FUNCTION OF

More information

edited by Derek P. Hendry'

edited by Derek P. Hendry' JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR INFORMATION ON CONDITIONED REINFORCEMENT A review of Conditioned Reinforcement, edited by Derek P. Hendry' LEWIS R. GOLLUB2 UNIVERSITY OF MARYLAND 1970,

More information