REPEATED MEASUREMENTS OF REINFORCEMENT SCHEDULE EFFECTS ON GRADIENTS OF STIMULUS CONTROL
MICHAEL D. ZEILER


JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR, 1969, 12, 451-461, NUMBER 3 (MAY)

REPEATED MEASUREMENTS OF REINFORCEMENT SCHEDULE EFFECTS ON GRADIENTS OF STIMULUS CONTROL(1)

MICHAEL D. ZEILER, UNIVERSITY OF IOWA

(1) This research was supported by Research Grants MH 8818 and HD 2845 from the U.S. Public Health Service. Reprints may be obtained from the author, Institute of Child Behavior and Development, University of Iowa, Iowa City, Iowa 52240. After Sept. 1, 1969, the author's address will be: Dept. of Psychology, Emory University, Atlanta, Georgia 30322.

Two experiments studied the effects of reinforcement schedules on generalization gradients. In Exp. 1, after pigeons' responding to a vertical line was reinforced, the pigeons were tested with 10 lines differing in orientation. Reconditioning and the redetermination of generalization gradients were repeated from 8 to 11 times, with the schedule of reinforcement varied in the reconditioning phase. Stable gradients could not be observed because the successive reconditionings and tests steepened the gradients and reduced responding. Experiment 2 overcame these effects by first training the birds to respond to all of the stimuli. Then, brief periods of reinforced responding to the stimulus correlated with reinforcement alternated with the presentation of the 10 lines in extinction. The development of stimulus control was studied eight times with each bird, twice with each of four schedules of reinforcement. Gradients were similar each time a schedule was imposed; the degree of control by the stimulus correlated with reinforcement varied with particular schedules. Behavioral contrast occurred when periods of reinforcement and extinction alternated, and was more durable with fixed-interval, variable-interval, and variable-ratio schedules than with fixed-ratio or differential-reinforcement-of-low-rate schedules.

Although generalization gradients (the gradients used to depict the relation between environmental stimuli and a quantifiable property of responses) vary with the schedule of reinforcement used in training (Hearst, 1965; Thomas and Switalski, 1966), there is no information as to whether these influences continue beyond the first test. Data on the relation between schedules and stimulus control have been obtained by giving animals one cycle of training and testing.

A most impressive property of reinforcement schedules, however, is their ability to establish characteristic behavior, for example, the temporal distribution of responses, that is largely independent of experimental history (Ferster and Skinner, 1957). The purpose of the present experiments was to investigate whether schedules might also have characteristic and recoverable effects on gradients of stimulus control.

Subjects
Four White Carneaux pigeons were maintained at 80% of their free-feeding weights throughout the two experiments. Birds 4, 42, and 46 had not been used in other research; Bird 1 had previously acquired a discrimination among three stimuli differing in size.

Apparatus
The experimental chamber was modified from the standard unit described by Ferster and Skinner (1957). The clear plastic response key (R. Gerbrands Co.), operated by a peck with a minimum force of 12 g, was displayed through a 19-mm diameter hole centered 216 mm above the floor of the chamber. A 51-mm square aperture below the key provided occasional access to mixed grain for 4 sec, and was illuminated by two 6-W clear bulbs when grain was available. The key and aperture lights provided the only sources of illumination. White noise masked extraneous sounds. All phases of the experiment were fully automated.

An inline display projector projected 10 different angular orientations of a white line 19 mm long and 5 mm wide on the rear of the response key. The orientations, designated by their distance from vertical with clockwise differences denoted as positive and counterclockwise differences as negative, were 0 (vertical), ±18, ±36, ±54, ±72, and 90 (horizontal) degrees.

EXPERIMENT 1: GENERALIZATION TESTS SEPARATED BY EXTENSIVE RECONDITIONING

The standard procedure for obtaining generalization gradients is to train the subject with one stimulus and then test with a range of stimuli in extinction. Therefore, this procedure was used to try to observe stable effects of reinforcement schedules on the gradients. The birds were trained to respond to one stimulus under a particular schedule, were given generalization tests in extinction, were trained under a different schedule, were tested again, etc. A concern from the outset, however, was generated by data showing that, after pigeons were trained with one stimulus correlated with reinforcement (S+), gradients changed with repeated testing (Blough, 1961; Guttman and Kalish, 1956; Kalish and Haber, 1963). Later tests did not recover earlier performance; instead, control by S+ increased. In the Kalish and Haber study, the tests followed each other with no interpolated retraining with S+. In the other experiments, the birds were retrained between the successive tests, but the period of reconditioning was short: Blough used one 2-hr period, and Guttman and Kalish used three 35-min periods. To try to overcome the failure of later tests to recover earlier behavior, the present experiment retrained pigeons for much longer periods between tests.

Procedure
In training, the vertical line (S+) appeared for 30-sec periods followed by 1 sec with the key dark. Sessions ended after 60 reinforcements. Training continued until performance stabilized, but for at least 13 sessions; the stabilization criterion involved total session time on ratio schedules and total number of responses per session on interval schedules, the values for each of three successive days having to be within 10% of the other two days. On the day after performance stabilized, 30 reinforcements were delivered for responses to S+ according to the schedule used in the preceding session. Then, each subject received seven different sequences of the 10 stimuli with no responses reinforced. The same order of 70 stimulus presentations was used in all generalization tests. Each stimulus appeared for 30 sec with the key dark for 1 sec between stimuli. Reconditioning with S+ began on the day after the test, and continued until behavior met the stability criterion, but for at least 13 days. Then, there was another test day. The entire procedure of training to a stability criterion and testing in extinction was repeated 11 times for Bird 1, 10 times for Birds 4 and 42, and eight times for Bird 46. Reinforcement schedules changed for each reconditioning; the schedule and the number of days with each are shown in Table 1. The schedules were fixed ratio (FR): reinforcement followed a fixed number of responses; variable ratio (VR): reinforcement followed an irregular number of responses with the mean number specified by the VR value; fixed interval (FI): reinforcement followed the first response occurring after the period of time specified by the FI value; variable interval (VI): reinforcement followed the first response after irregular times with the mean interval specified by the VI value. Responses and time when the key was dark did not count in meeting the schedule requirements, and intervals were timed from the end of each magazine cycle.

Table 1
Sequence of Schedules: Experiment 1

        Bird 1            Bird 4            Bird 42           Bird 46
     Schedule  Days    Schedule  Days    Schedule  Days    Schedule  Days
 1   FR 60      18     VR 60      30     VI 1-min   24     FI 1-min   30
 2   FI 1-min   24     FR 60      17     FI 1-min   22     VR 60      21
 3   VI 1-min   19     VI 1-min   20     FR 60      18     VI 3-min   18
 4   VR 60      18     FI 1-min   15     VR 60      21     FR 60      37
 5   FI 1-min   20     VR 60      13     FI 1-min   30     VR 60      17
 6   FI 5-min   36     FR 60      18     VI 1-min   15     FI 1-min   24
 7   FR 150     20     FR 60      21     VR 60      20     FR 60      19
 8   FI 5-min   19     FI 2-min   30     FR 60      23     VI 1-min   21
 9   FI 5-min   30     VR 60      16     FI 1-min   28
10   FR 150     18     VI 1-min   18     VR 60      20
11   FI 5-min   22

RESULTS
Figure 1 shows that the number of responses per test declined with successive schedules (successive reconditionings). The decline reversed with the first two exposures to FI 5-min for Bird 1 (Schedules 6 and 8), the first exposure to VI 1-min for Bird 4 (Schedule 3), and the first two exposures to VR 60 for Bird 46 (Schedules 2 and 5).

Under all conditions, the generalization gradients tended to be asymmetrical, with more responding to stimuli rotated clockwise from S+ than to stimuli rotated counterclockwise an equal number of degrees. All gradients peaked at S+ except the first for Birds 4 and 46; in these two instances, the most responses occurred to stimuli 18 degrees different from S+, with slightly fewer responses to the vertical line. In all gradients, responding progressively decreased with increased differences between the stimulus controlling the most responses and the other lines.

The percentage of total responses to S+ measured stimulus control and compensated for differences in the absolute rate of responding. As Fig. 2 shows, responding to S+, relative to responding to the other stimuli, increased with successive schedules for Birds 4, 42, and 46. Bird 1 showed the same tendency for the first five tests, but the sixth, preceded by reconditioning with the FI 5-min schedule, reversed the trend. The next exposure to FI 5-min (Schedule 8) reproduced the lowered relative responding to S+. Control by S+ increased with the third and fourth exposures to the schedule (Schedules 9 and 11). For Bird 1, reversals in the decline of total responses and in the increase in relative responding to S+ occurred with the same schedule; for the other birds, there were no reversals of the increased control by S+.

The patterns of responding during training corresponded to those typical of the various schedules (Ferster and Skinner, 1957). Fixed ratio produced a post-reinforcement pause and an abrupt shift to a high rate of responding that persisted until the next grain presentation; fixed interval resulted in a post-reinforcement pause followed by positively accelerated responding; and variable interval and variable ratio produced steady rates of responding.

Fig. 1. Number of responses in generalization tests following each schedule.

Fig. 2. Percentage of total responses occurring to S+ in tests following each schedule.
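The four contingencies defined in the Procedure reduce to simple decision rules. The sketch below is illustrative only (the class names and the randint/uniform samplers are assumptions, not the experiment's control equipment); it shows, for each schedule type, when a response produces reinforcement.

```python
import random

class FixedRatio:
    """FR n: reinforcement follows every nth response."""
    def __init__(self, n):
        self.n = n
        self.count = 0
    def record_response(self, now):
        self.count += 1
        if self.count >= self.n:
            self.count = 0       # ratio requirement met; reset the counter
            return True
        return False

class VariableRatio:
    """VR n: reinforcement follows an irregular number of responses, mean n."""
    def __init__(self, n):
        self.n = n
        self._arm()
    def _arm(self):
        self.required = random.randint(1, 2 * self.n - 1)  # mean of n
        self.count = 0
    def record_response(self, now):
        self.count += 1
        if self.count >= self.required:
            self._arm()
            return True
        return False

class FixedInterval:
    """FI t: reinforcement follows the first response after t seconds."""
    def __init__(self, t):
        self.t = t
        self.start = 0.0         # timed from the end of each magazine cycle
    def record_response(self, now):
        if now - self.start >= self.t:
            self.start = now
            return True
        return False

class VariableInterval:
    """VI t: like FI, but the required time is irregular with mean t."""
    def __init__(self, t):
        self.t = t
        self.start = 0.0
        self._arm()
    def _arm(self):
        self.required = random.uniform(0.0, 2.0 * self.t)  # mean of t
    def record_response(self, now):
        if now - self.start >= self.required:
            self.start = now
            self._arm()
            return True
        return False
```

For example, FixedRatio(60) reinforces every sixtieth peck, and FixedInterval(60.0) reinforces the first peck occurring at least 60 sec after the previous reinforcement.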

DISCUSSION
The decline in the absolute number of responses and the increase in relative control by S+ with successive tests replicated the results of earlier experiments (Blough, 1961; Guttman and Kalish, 1956; Kalish and Haber, 1963). Since the prolonged reconditioning of the present experiment resulted in behavior similar to that observed without or with brief reconditioning, the cumulative effects of repeated testing on stimulus control appear not to be overcome by extensive retraining with S+. Although the schedules determined characteristic rates and patterns of responding, the only evidence for an idiosyncratic schedule effect on stimulus control came from the FI 5-min schedule for Bird 1. The first two applications of this schedule decreased the amount of control and increased the number of responses during the test. However, additional reconditionings failed to maintain these distinctive effects.

EXPERIMENT 2: ACQUISITION OF STIMULUS CONTROL WITH DIFFERENT SCHEDULES OF REINFORCEMENT

In Exp. 1, the birds' responses to a stimulus other than S+ were not reinforced. The situation may be conceptualized as discrimination training, with S+ being the vertical line whenever the stimulus did not change from one 30-sec interval to the next, and S- being the presentation of a changing stimulus or any stimulus different from S+. The progressive increase in stimulus control and decrease in number of test responses fits this hypothesis. The procedure of Exp. 2 weakened the contrast between conditions. Whenever reinforcement schedules were changed, the birds first received grain for responding to all of the stimuli until they responded equally to each. Next, reinforcement for responses to S+ alternated with extinction periods in which all 10 stimuli were presented without reinforcement. Responding in the extinction periods measured stimulus control. Thus, Exp. 2 differed from Exp. 1 in that stimulus control was eliminated before reconditioning with S+. The new procedure shifted the emphasis from an evaluation of control after a period of exposure to a single stimulus to the progressive development of control by S+, and was used to investigate how different schedules affected the acquisition of a discrimination.

Procedure
The procedure consisted of two phases. In Phase 1, the stimuli changed every 15 sec and food presentation was independent of the stimulus appearing on the key when the schedule requirements were met. In Phase 2, periods with food available for responses to the vertical line (S+) alternated with the presentation of all of the stimuli with no responses reinforced. The seven sequences of stimuli were the same as those of Exp. 1. In each of the seven sequences the ordering of the 10 stimuli differed, but the order of stimuli within each sequence and the order of the sequences themselves were fixed. Neither responses nor time were counted during the 1-sec stimulus-off periods that occurred every 15 sec in both phases. The same schedule of reinforcement was used throughout a given training and testing series. The schedules and the number of days of training to respond to all stimuli (Phase 1) are shown in Table 2. Each bird had two exposures to each of four different reinforcement schedules. In addition to the schedules of Exp. 1 (fixed-ratio, fixed-interval, variable-ratio, variable-interval), this study used differential-reinforcement-of-low-rate schedules (DRL). A DRL schedule specifies the minimum time that must elapse between successive responses for a reinforcement to occur. Phase 1 continued until visual inspection revealed stable patterns of responding and an approximately equal number of responses to the various stimuli. The stability criterion of Exp. 1 was abandoned because it would have been met before recovery of the pre-discrimination baseline of equal responding to all stimuli. Sessions ended after 60 reinforcements. Phase 2 consisted of 15 days of discrimination training.

The first day began with reinforced responding to all of the stimuli. When at least 10 reinforcements had been delivered and each stimulus had appeared equally often, this preliminary period ended. Then, the presentation of S+ for 15-sec periods for 10 reinforcements alternated with 15-sec presentations of each of the 10 stimuli with no

responses reinforced. The day ended with the sixtieth magazine cycle. Thus, on the first day of discrimination training there were 10 to 12 reinforcements delivered independent of stimuli, twenty-eight 15-sec presentations of each stimulus with no food available for responses, and 48 to 50 reinforcements for responses to S+. After the first day, there was no preliminary period of reinforcement in the presence of all stimuli. Ten magazine cycles for responding to S+, delivered according to the schedule in force, alternated with seven presentations of each stimulus in extinction until there were 60 reinforcements. On each day, therefore, there were thirty-five 15-sec periods with each of the 10 stimuli with no food available and 60 reinforcements for responding to S+. After the 15 days of discrimination training, responses to all stimuli were again reinforced, and the sequence of procedures was repeated with a different schedule of reinforcement.

Table 2
Sequence of Schedules: Experiment 2

        Bird 1              Bird 4              Bird 42             Bird 46
     Schedule  Days in   Schedule  Days in   Schedule    Days in  Schedule  Days in
               Phase 1             Phase 1               Phase 1            Phase 1
1.   FR 60       18      VR 75       20      DRL 10-sec    18     VI 1-min    25
2.   FI 5-min    32      VI 1-min    24      FR 60         36     FR 60       17
3.   DRL 15-sec  16      FI 2-min    14      FI 2-min      38     FI 5-min    18
4.   VI 2-min    20      FR 60       19      DRL 10-sec    19     FR 60       22
5.   DRL 15-sec  18      VI 1-min    19      VR 60         17     VI 1-min    24
6.   FI 5-min    19      VR 75       23      FI 2-min      17     VR 60       16
7.   FR 60       15      FR 60       18      VR 60         15     FI 5-min    13
8.   VI 2-min    14      FI 2-min    23      FR 60         14     VR 60       18

RESULTS
Gradients of Stimulus Control
Experiment 1 revealed that responses decreased in successive periods of extinction separated by reconditioning with S+. Because of the strength of this cumulative effect, it was impossible to study whether schedules of reinforcement had stable or consistent influences on gradients of stimulus control. It was essential, therefore, to determine if the present procedure, which also involved repeated reconditioning and extinction between applications of different schedules, eliminated the decrement in responding observed in Exp. 1. An important difference between the two experiments was that Exp. 1 involved reconditioning only with S+, whereas all of the stimuli appeared in Exp. 2.

Figure 3 shows the total number of responses during the extinction periods of the first (open bars) and fifteenth (filled bars) days of discrimination training for both applications of each schedule of reinforcement. Comparisons of the first and second applications indicate whether repeated reconditioning and 15-day phases of discrimination training, involving alternated reinforcement for responding to S+ and extinction with all of the stimuli, influenced the number of responses on each day; comparisons of the first and fifteenth days with each schedule reveal the influence of maintained discrimination training without interpolated reconditioning.

Fig. 3. Total responses in the test intervals on Day 1 (open bars) and Day 15 (filled bars). The left bar of each pair is the first and the right bar is the second exposure to the schedule.

The amount of responding on each day of extinction was independent of the number of schedules. With some schedules there was more responding on the first exposure than on the second (left bar of each pair), and with others there was more responding on the second exposure (right bar of each pair). In contrast, with every application of a schedule, the total responses during the extinction periods progressively declined with maintained training. Successive days of discrimination training decreased the frequency of responding, but the number of reconditionings had no orderly cumulative effect.

Because successive schedules had no apparent systematic effect on the number of responses during discrimination training, it was possible to study the way different schedules influenced gradients of stimulus control. The changes in responding over days of training, however, required looking at behavior on individual days. Measures of stimulus control based on the absolute number of responses to each line were variable because the rates of responding during extinction differed, although unsystematically, both for the same and different schedules (Fig. 3). Therefore, gradients were derived from relative rather than absolute response frequencies.

Fig. 4. Relative gradients of stimulus control for the second exposure to each schedule. The number of responses to each stimulus was divided by the number of responses to S+ (0). Open circles are Day 2, filled circles are Day 15.

Figure 4 shows relative gradients for only the second and fifteenth days of the second application of each schedule (the relative gradients for the first and second applications were similar). On Day 2, the gradients of Birds 1 and 42 had less slope with DRL than with any other schedule. For Bird 1, FR produced the steepest gradients, next was VI, then FI. For Bird 42, FI and FR were approximately equal and produced more control than did VR. For Bird 4, the steepest gradients were with VI and FR, next was FI, with the flattest gradients with VR. Bird 46 displayed the steepest gradients with FR and VR; FI and VI were approximately equal. By Day 15, there was clear control by S+ with all schedules. For Birds 1 and 42, however, there was relatively more responding to other stimuli with DRL than with any other schedule. Although the other three schedules (FI, VI, FR) produced similar gradients with Bird 1, FI produced less control than VR and FR with Bird 42. The other birds did not receive DRL, but each showed a schedule effect.

Fig. 5. Percentage of total test responses occurring to S+ over the 15 days of discrimination training. Open circles are for the first exposure to the schedule, filled circles are for the second exposure.

For Bird 4, the flattest gradients occurred with FI, and for Bird 46, the flattest gradients were with VR.

Comparisons among schedules were facilitated by computing the per cent of total test responses made to S+ on each day (Fig. 5). For each bird, a 4 x 15 analysis of variance conducted on the single estimates of the degree of control by S+ compared the different schedules and took into account the variability between the first and second exposures. One factor was the four schedules, the other was the 15 days of training, and the error term was the sum of squares for the first and second exposures. As the interaction between schedules and days was not statistically significant, the general shapes of the acquisition curves across days appeared to be similar for the different schedules. For all birds, the difference between schedules and the difference between days of training were significant (for both, p < .01). The source of the days' effect was obvious from Fig. 5: the percentage of responses to S+ increased with successive days of training. Schedule effects were studied by paired comparisons. Bird 1 revealed less control by S+ with DRL than with any other schedule; Bird 42 revealed less control by S+ with DRL than with FR, VR, and FI, and more control with FR and VR than with FI; Bird 4 had more control by S+ with FR than with either VI or VR, and greater control with FI than with VI; and Bird 46 had more control by S+ with FR than with any other schedule. The consistent results were that DRL always produced the least control by S+, and FR always resulted in at least as much and often more control than did the other schedules.

Response Rates During Phases 1 and 2
The rate of responding when reinforcement was available changed with the onset of discrimination training (Phase 2). Figure 6 shows response rates during Phase 1, to the vertical line during the reinforced intervals on each day of discrimination training, and to all stimuli during the extinction periods of Phase 2. The data shown are for the last schedule studied with each bird and are representative of those observed with the first use of each schedule. Particular types of schedules produced similar behavior in all birds. The rates of responding during the periods of reinforced responding in Phase 2 were higher than in Phase 1; this increase occurred at the same time that the rate decreased during the extinction (test) periods. The only schedule not shown, DRL, produced similar rate changes: for Bird 1, the rate increased from .13 responses per second in Phase 1 to .17 responses per second in the reinforcement intervals of Phase 2, and for Bird 42, the rate rose from .21 to .36 responses per second. For both birds, the rate in extinction fell to .03 responses per second. With FI, VI, and VR schedules, the rate during reinforcement periods was higher than that of Phase 1 throughout discrimination training; with FR and DRL, the rate declined to the Phase 1 level after reaching a peak during the first 10 days of training.

Fig. 6. Rate of responding during the periods of each day of discrimination training when reinforcement was available (Reinforcement) and when reinforcement was not available (Extinction). The first point shows the average response rate at the end of Phase 1, when reinforcement was always available; the rates for the last five days of Phase 1 were within .1 response per second of the value depicted.
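The DRL contingency used here reinforces a response only when the inter-response time (IRT) meets a minimum. A minimal sketch of that rule (illustrative only; the class name and the choice to reinforce a session's first response are assumptions, not the experiment's control equipment):

```python
class DRL:
    """Differential reinforcement of low rate: a response is reinforced only
    when at least `min_irt` seconds have elapsed since the previous response."""
    def __init__(self, min_irt):
        self.min_irt = min_irt
        self.last_response = None
    def record_response(self, now):
        irt_ok = (self.last_response is None
                  or now - self.last_response >= self.min_irt)
        self.last_response = now   # every response restarts the timer
        return irt_ok

# Usage: DRL 15-sec, the value run with Bird 1 in Phase 2.
drl = DRL(15.0)
print(drl.record_response(0.0))    # True: no prior response to time against
print(drl.record_response(5.0))    # False: 5-sec IRT is too short
print(drl.record_response(25.0))   # True: 20-sec IRT meets the minimum
```

Note that a premature response resets the timer, which is why DRL sustains only low response rates.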

SCHEDULE AND STIMULUS CONTROL 459

Fig. 7. Cumulative records for the five test intervals of the tenth day of discrimination training. Intervals are stacked in order with the first interval at the top. Data are for the second exposure to each schedule.

The rate of responding in extinction was related both to the stimulus appearing on the key and to the time at which it appeared. Cumulative records for the extinction periods of the tenth day of discrimination training with the second application of each schedule (Fig. 7) show the tendency for the rate of responding to be positively accelerated. Extinction began either with a pause in responding or with a burst of responses followed by a pause, and then responding increased. The positive acceleration appeared by the second day and was maintained despite the decreases in the number of responses over the course of discrimination training and over the five periods of each day. The positive acceleration during extinction was unrelated to the patterns of responding when food was available for responses. Fixed-interval schedules produced a post-reinforcement pause followed by positively accelerated responding, FR produced a post-reinforcement pause followed by an abrupt shift to responding at a high rate, VR and VI produced a moderate to high steady rate of responding, and DRL produced a low rate of responding that was either steady or slightly negatively accelerated. These patterns are typical of those observed with the same schedules of reinforcement when pigeons are exposed to a single stimulus (Ferster and Skinner, 1957).

DISCUSSION

The period of reinforcement for responses to all of the stimuli eliminated stimulus control, and discrimination training with the vertical line as S+ and all of the stimuli as S- changed this baseline.
Although gradients based on absolute numbers of responses were not recovered, relative gradients were similar with each application of a schedule. The occurrence of significant schedule effects based on the relative measure revealed that the procedure was sufficiently sensitive to permit studying variables influencing gradients of stimulus control. The principal schedule effect was that DRL produced the least control by S+. When these results are taken together with those of Hearst, Koresko, and Poppen (1964), they show that DRL generates less stimulus control than other schedules in first discriminations for naive pigeons and in repeated discriminations for pigeons given the same stimuli under a variety of schedules. The other consistency was that FR produced as steep or steeper gradients than FI, VI, VR, and DRL. This observation fits with several others showing that stimuli correlated with reinforcement on a fixed ratio rapidly obtain a high degree of control over behavior (Dews, 1963; Ferster, 1960; Morse, 1955; Zeiler, 1968).

Although other values of the schedules used might have produced different gradients, experiments that have manipulated schedule size in simple simultaneous or successive discriminations suggest that schedules are inconsistent in this respect. Stimulus control decreased with increases in the mean value of variable-interval schedules (Hearst et al., 1964), but did not change with the size of fixed-ratio schedules (Zeiler, 1968). Comparisons of these two studies are difficult, however, because the first studied different animals at each VI value, and the second used the same birds under all ratios. The considerable similarity in gradients under VI 1-min and VI 2-min schedules and under FI 2-min and FI 5-min schedules in the present study suggested a minimal role of interval size in birds given repeated training and testing. The effects of the various schedules do not appear to be integrated by reference to frequency of reinforcement (reinforcements per second), probability of reinforcement (reinforcements per response), or rate of responding (responses per second). The order of schedules, from highest to lowest, in frequency of reinforcement was DRL, FR, VR, VI, FI; in probability of reinforcement, the order was DRL, FR and VR, FI, VI; in average rate of responding, the order was FR, VI or VR depending on particular values, FI, and DRL. Thus, FR and DRL were most alike in frequency and probability of reinforcement, but produced the largest differences in stimulus control. In terms of response rate, DRL produced both a much lower rate and less control by S+ than did FR, VR, FI, or VI. However, although FI, the schedule closest in response rate to DRL, produced less control than did FR, VR, or VI for some birds, FI produced a high level of control by S+ together with a low response rate in others. Hearst's (1965) detailed review also showed that schedule influences on stimulus control cannot be attributed to differences in average response rate.
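The two reinforcement measures just defined follow directly from the schedule definitions. The sketch below illustrates the arithmetic for ratio and interval schedules; the response rates and schedule values are assumed for the example and are not the birds' obtained values:

```python
# Illustrative arithmetic for the two reinforcement measures discussed
# above: frequency (reinforcements per second) and probability
# (reinforcements per response). All rates and schedule parameters here
# are assumptions for the example, not data from the experiment.

def ratio_schedule(ratio: int, resp_rate: float):
    """FR/VR: one reinforcer per `ratio` responses (mean ratio for VR)."""
    probability = 1.0 / ratio            # reinforcements per response
    frequency = resp_rate / ratio        # reinforcements per second
    return frequency, probability

def interval_schedule(interval_s: float, resp_rate: float):
    """FI/VI: at most one reinforcer per interval (mean interval for VI),
    assuming steady responding fast enough to collect each reinforcer."""
    frequency = 1.0 / interval_s         # reinforcements per second
    probability = frequency / resp_rate  # reinforcements per response
    return frequency, probability

# An FR 60 at a high assumed rate vs. an FI 120-s at a low assumed rate:
# the ratio schedule yields the higher reinforcement frequency here,
# while the two per-response probabilities come out nearly equal.
print(ratio_schedule(60, 2.0))
print(interval_schedule(120.0, 0.5))
```

This is one reason frequency and probability of reinforcement can dissociate across schedule types, as the orderings in the text show.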
The enhanced rate of responding maintained by all schedules during discrimination training is an example of behavioral contrast. Behavioral contrast has been demonstrated whenever one stimulus is correlated with reinforced responding, another stimulus is correlated with the absence of reinforcement, and the stimuli appear successively (see the review and data of Reynolds and Catania, 1961, for descriptions of contrast obtained with the types of schedules studied in the present experiment). That FI, VI, and VR schedules produced contrast throughout the discrimination-training phase, whereas FR and DRL had more transitory effects, suggests the influence of particular reinforcement schedules on the maintenance of enhanced responding. Although both discrimination and contrast depend on differential reinforcement and on particular reinforcement schedules, the same schedule can influence the two aspects of behavior differently. This possibility is illustrated by the observation that FR and DRL, which produced clear differences in stimulus control, had similar influences on contrast. In agreement with Ferster's (1960) finding in experiments involving matching-to-sample, characteristic patterns of responding recurred with each application of a schedule despite the frequency of stimulus changes during acquisition. Another regularity was the pattern of responding during extinction: responding was positively accelerated over the wide range of response rates generated by the different schedules. Similar patterns of responding in the presence of a stimulus correlated with extinction, when it alternated with one correlated with reinforcement, were reported by Morse (1955).
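One conventional way to quantify a contrast effect of the kind described above is the ratio of the rate in the reinforced component during discrimination training to the pre-discrimination baseline rate. This index is a common convention, not necessarily the one used in this paper, and the values below are hypothetical:

```python
# Hedged sketch: a conventional index of behavioral contrast -- the
# reinforced-component rate during discrimination training relative to
# the pre-discrimination (Phase 1) baseline rate. A ratio above 1.0
# marks enhanced responding. All values are hypothetical.

def contrast_ratio(discrimination_rate: float, baseline_rate: float) -> float:
    """Ratio of reinforced-component rate to baseline rate."""
    return discrimination_rate / baseline_rate

# A hypothetical Phase 1 baseline of 0.25 resp/s rising to 0.40 resp/s
# in the reinforced component of Phase 2:
print(contrast_ratio(0.40, 0.25))  # 1.6
```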
The present experiments did not show whether repeated nondifferential reinforcement with all stimuli followed by discrimination training will yield conclusions similar to those of procedures not involving reinforced responding to all stimuli, nor did they indicate the type of problem for which the advantages of studying variables in the same individual must be sacrificed in favor of experimental naivete. The literature does suggest, however, that a variety of problems in stimulus control are amenable to study with single organisms. Reynolds (1961) investigated reinforcement and extinction gradients, and Blough (1961) was able to study gradients produced at various locations on the wavelength continuum in the same subjects; Lyons and Thomas (1967) reported that the extent of control by S+ could be changed and recovered by varying the type of training with a stimulus lying on another dimension; Sidman and Rosenberger (1967) and Boren and Devine (1968) were able to use the same subjects to explore and replicate an analysis of the significant variables controlling sequential spatial discriminations. These promising results suggest that there may be nothing inherent in the problems of stimulus control to prevent studying the same individual repeatedly.

REFERENCES

Blough, D. S. The shape of some wavelength generalization gradients. Journal of the Experimental Analysis of Behavior, 1961, 4, 31-40.
Boren, J. J. and Devine, D. D. The repeated acquisition of behavioral chains. Journal of the Experimental Analysis of Behavior, 1968, 11, 651-660.
Dews, P. B. Behavioral effects of drugs. In S. M. Farber and R. H. L. Wilson (Eds.), Conflict and creativity. New York: McGraw-Hill, 1963. Pp. 138-153.
Ferster, C. B. Intermittent reinforcement of matching to sample in the pigeon. Journal of the Experimental Analysis of Behavior, 1960, 3, 259-272.
Ferster, C. B. and Skinner, B. F. Schedules of reinforcement. New York: Appleton-Century-Crofts, 1957.
Guttman, N. and Kalish, H. I. Discriminability and stimulus generalization. Journal of Experimental Psychology, 1956, 51, 79-88.
Hearst, E. Approach, avoidance, and stimulus generalization. In D. I. Mostofsky (Ed.), Stimulus generalization. Stanford: Stanford Univ. Press, 1965. Pp. 331-355.
Hearst, E., Koresko, M. B., and Poppen, R. Stimulus generalization and the response-reinforcement contingency. Journal of the Experimental Analysis of Behavior, 1964, 7, 369-380.
Kalish, H. I. and Haber, A. Generalization: I. Generalization gradients from single and multiple stimulus points. II. Generalization of inhibition. Journal of Experimental Psychology, 1963, 65, 182-189.
Lyons, J. and Thomas, D. R. Effects of interdimensional training on stimulus generalization: II. Within-subjects design. Journal of Experimental Psychology, 1967, 75, 572-574.
Morse, W. H. An analysis of responding in the presence of a stimulus correlated with periods of nonreinforcement. Unpublished doctoral dissertation, Harvard University, 1955.
Reynolds, G. S. Contrast, generalization, and the process of discrimination. Journal of the Experimental Analysis of Behavior, 1961, 4, 289-294.
Reynolds, G. S. and Catania, A. C. Behavioral contrast with fixed-interval and low-rate reinforcement. Journal of the Experimental Analysis of Behavior, 1961, 4, 387-391.
Sidman, M. and Rosenberger, P. B. Several methods for teaching serial position sequences to monkeys. Journal of the Experimental Analysis of Behavior, 1967, 10, 467-478.
Thomas, D. R. and Switalski, R. W. A comparison of stimulus generalization following variable ratio and variable interval training. Journal of Experimental Psychology, 1966, 71, 236-240.
Zeiler, M. D. Stimulus control with fixed-ratio reinforcement. Journal of the Experimental Analysis of Behavior, 1968, 11, 107-115.

Received 29 July 1968.