ANTECEDENT REINFORCEMENT CONTINGENCIES IN THE STIMULUS CONTROL OF AN AUDITORY DISCRIMINATION¹

ROSEMARY PIERREL AND SCOT BLUE


JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR, 1967, 10, 545-550, NUMBER 6 (NOVEMBER)

BROWN UNIVERSITY

In order to assess possible confounding of discriminative stimulus effects with those produced by the reinforcing stimulus, three groups of four rats each were trained for 45 hr on a variable-interval 1-min reinforcement program. Two groups were run on a multiple variable-interval extinction schedule in which the reinforcement stimulus (SD) and the nonreinforcement stimulus (SΔ) were two intensities of a 4-kHz tone separated by 40 or 10 db. The third group was run on a mixed schedule with a single intensity constantly present. The mixed-schedule animals showed no discrimination of the reinforcement program. Under the multiple schedule, the highest SΔ rates were obtained after SD intervals, regardless of the reinforcement availability in the SD interval. These local rate variations in SΔ were small in proportion to those produced by the SD versus SΔ intensities.

Jenkins (1965) has criticized much of the previous operant discrimination work on the basis that the effects produced by the discriminative stimulus are confounded with effects produced by the stimulus used as a reinforcer. The problem of parceling out various aspects of the stimulus control of responding is not a new one (Pavlov, 1927). In studies of generalization and discrimination the interest is in assessing the extent to which variations along a specific stimulus continuum produce co-variations in rate of responding (Dinsmoor, 1950; Guttman and Kalish, 1956; Pierrel and Sherman, 1960). It is well established that the occurrence or non-occurrence of reinforcement produces local variations in rate (Ferster and Skinner, 1957). Jenkins suggests, however, that the presentation of a reinforcing stimulus produces an immediate and major effect upon rate. If such local variations are assumed to be comparable in magnitude to those effects resulting from varying the discriminative stimulus continuum, it is difficult to assess continuum effects without separating them in the data analysis from effects due to the reinforcement schedule.

Jenkins describes a discrete-trial procedure in which the effects of antecedent reinforcement and nonreinforcement are analyzed separately from the effects of the current external stimulus. In his procedure, the trial (and responding) is terminated when reinforcement is delivered. Comparison trials of equal duration are terminated without reinforcement. His data, obtained from pigeons trained on a visual discrimination, show that response probability is related not only to the visual stimulus but also to reinforcement or nonreinforcement on the immediately preceding trial. Boneau, Holland, and Baker (1965) collected wavelength generalization gradients for pigeons, separating responses made just before and just after reinforcement. While they found some post-reward effect, the gradient being displaced upwards when responses immediately following reinforcement were isolated, the general shape of the gradient was largely unaffected by the reinforcement-nonreinforcement contingency. There were essentially no differences among response probabilities measured one or two trials before reinforcement, and two trials after reinforcement.

An alternative to adopting a trial procedure to assess the confounding of stimulus continuum and reinforcement effects is to attempt to analyze and evaluate the extent to which reinforcement delivery controls response rate in an operant discrimination situation. Such an inquiry adds to the information concerning multiple stimulus control and permits continuing contact with the considerable body of discrimination-generalization data measured in terms of response rate. The present experiment utilized an operant discrimination paradigm in which an "easy" and a "difficult" two-valued auditory discrimination was trained. A control group was run on a mixed schedule. Intensity effects, as related to the presence and absence of immediately antecedent reinforcement, were analyzed.

¹This research was conducted as a part of a research program being carried out under United States Public Health Service Grant HD-00928. Reprints may be obtained from Rosemary Pierrel, Walter S. Hunter Laboratory of Psychology, Brown University, Providence, Rhode Island 02912.

METHOD

Subjects

Twelve male rats of a Sprague-Dawley derived strain, obtained from Carworth, Inc., were about 100 days old at the start of experimentation. They were reduced to 80% of their free-feeding weights and were maintained at approximately this level by the food reinforcers received during experimental sessions.

Apparatus

Four similar sound-shielded operant spaces were used. Each was equipped with a retractable bar, water bottle, pellet dispenser, food dish, and speaker. The animal chamber was constructed so as to produce an acoustically transparent enclosure with a homogeneous sound field. Programming and recording equipment were located in an adjacent room. The auditory stimuli were various intensities of a 4-kHz interrupted tone. The tone was on for 1.5 sec, followed by 2.5 sec of silence. Intensity changes were made only during a silent period. The temporal scheduling of intensity changes, reinforcements, and counter printouts was controlled by a Space Mechanisms Five-Channel Photoelectric Programmer (Model SM607). A more detailed description of the enclosures, sound-generating, recording, and control equipment is given in Pierrel and Sherman (1960).

Procedure

The animals were randomly assigned to three groups of four each and placed in an experimental chamber in the presence of the 100-db intensity. This stimulus was to be SD (multiple-schedule groups), or was to be constantly present (mixed-schedule group). All intensities are specified from a 0.0002 dynes/cm² reference level.
Each rat was bar-trained by being given 50 reinforcements on a fixed-interval 10-sec schedule, immediately succeeded by 75 reinforcements presented on a variable-interval (VI) 1-min schedule. The animals were returned to the enclosures approximately 20 hr after bar-training and remained there for the next 60 hr.² Experimental sessions lasted 9 hr. At the end of a session the sound went off and the bars were retracted. These conditions were in effect for 3 hr, after which the next session began. Thus, the animals were run 18 out of every 24 hr.

The multiple-schedule animals had 16 min of exposure to the SD intensity (100 db) and 44 min of the SΔ intensity (60 or 90 db) during each experimental hour. SD periods ranged from 1 to 3 min and SΔ periods from 2 to 9 min in duration. Reinforcement in the SD was programmed on a geometric VI schedule with a mean of 1 min. The stimulus intensity sequence and times of reinforcement availability are shown in Table I. This program was repeated every 30 min of the session. A third group was run on a mixed schedule with the 100-db intensity present throughout. Reinforcements were programmed to occur at the same points in time within the mixed schedule as they did during the SD periods of the multiple schedule. Each group is referred to by the two stimuli that were present during the SD and SΔ periods; i.e., Group 100-60, Group 100-90, and Group 100-100.

Table I
Stimulus intensity sequence and reinforcement availability schedule.

Minute in Session of SD Intensity    Reinforcement Setup (Sec from interval onset)
 1                                   9, 52
 2                                   no reinforcement
 3                                   10
10                                   30, 40
13                                   no reinforcement
23                                   15
24                                   48
28                                   25

The SΔ intensity was present during minutes 4-9, 11, 12, 14-22, etc. That is, all minutes in the session not shown above were the SΔ condition.

²Two exceptions to this procedure resulted from difficulties with the equipment. Group 100-60 was removed from the enclosures after 48 hr, fed, and replaced 15 hr later for the last session. Group 100-100 was run for an extra 9 hr as a result of a recording and reinforcement delivery problem which occurred between hours 28 and 36.
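As a cross-check on the program just described, the Table I schedule can be encoded directly. This is a minimal sketch, not part of the original apparatus; the variable names are illustrative. Expanding the repeating 30-min block confirms the 16 min of SD and 44 min of SΔ per hour stated in the text:

```python
# Table I: minutes (within each repeating 30-min block) in which the SD
# intensity was present, mapped to the seconds (from interval onset) at which
# reinforcement was set up; None marks an SD minute with no reinforcement.
SD_MINUTES = {
    1: (9, 52),
    2: None,       # SD minute, no reinforcement programmed
    3: (10,),
    10: (30, 40),
    13: None,
    23: (15,),
    24: (48,),
    28: (25,),
}

BLOCK_MIN = 30  # the program repeated every 30 min of the session

sd_per_block = len(SD_MINUTES)                  # SD minutes per block
sdelta_per_block = BLOCK_MIN - sd_per_block     # all other minutes were SDelta
setups_per_block = sum(len(v) for v in SD_MINUTES.values() if v)

# Two blocks per hour yield the 16 min SD / 44 min SDelta split in the text.
print(2 * sd_per_block, 2 * sdelta_per_block, 2 * setups_per_block)
```

The eight reinforcement setups per block are consistent with the geometric VI 1-min schedule confined to the 16 SD minutes of each hour, though the hourly setup count itself is an inference, not a figure stated in the text.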

Response totals for any given minute were partitioned into one of six categories (two for SD and four for SΔ) corresponding to the intensity present during that minute and the intensity and reinforcement contingencies in effect in the immediately preceding minute. That is, SD responding was separated into those 1-min intervals which were preceded by either an SD or an SΔ interval. SΔ responses were also separated into those following SDs and those following SΔs. SΔ intervals succeeding SD intervals were further subdivided into those following an SD in which one, two, or no reinforcements were made available. Animals in the multiple-schedule groups received from 96 to 100% of the pellets programmed, and the mixed-schedule group obtained about 83% of them. Since such a high percentage of the reinforcements were collected, no attempt was made to determine possible rate variations as a function of whether or not each opportunity for reinforcement was in fact fulfilled.

RESULTS AND DISCUSSION

The histograms in Fig. 1 were constructed to show local rate variations in SD and SΔ periods as a function of the reinforcement and stimulus conditions in the antecedent interval during an early, middle, and late portion of the experiment. The first two sets of three bars refer to responding in SD intervals preceded by either an SD or an SΔ interval. The other four sets of bars show responding in SΔ separated into categories according to the type of preceding interval: an SD with one, two, or no reinforcements, or another SΔ. For this analysis, each 1 min of session time served as an antecedent interval and as a data point as well. Since the six types of interval categories appeared an unequal number of times during the session, totals were multiplied by appropriate constants to equate them. These histograms represent the mean data for each group of four animals.
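The six-way partition described above, together with the Discrimination Index computed later in this section (SD responses over total responses, with the SΔ totals first scaled by 16/44 ≈ 0.3636 to correct for unequal exposure), can be sketched as follows. The function names and the toy data are illustrative only, not from the original analysis:

```python
# Classify each 1-min interval by the current stimulus and by the stimulus and
# reinforcement contingencies of the immediately preceding minute, giving the
# six categories used in the histograms. Each minute is a (stimulus,
# n_reinforcements_programmed) pair with stimulus "SD" or "SDELTA".

def category(prev, cur):
    if cur[0] == "SD":
        # Two SD categories: preceded by SD or by SDelta.
        return "SD after SD" if prev[0] == "SD" else "SD after SDELTA"
    if prev[0] == "SDELTA":
        return "SDELTA after SDELTA"
    # Four SDelta categories overall: after SDelta, or after an SD with
    # 0, 1, or 2 reinforcements made available.
    return f"SDELTA after SD with {prev[1]} reinforcement(s)"

def discrimination_index(sd_responses, sdelta_responses):
    # SDelta totals are scaled by 16/44 (about 0.3636) because each hour
    # contained 44 SDelta minutes for every 16 SD minutes.
    corrected = sdelta_responses * (16 / 44)
    return sd_responses / (sd_responses + corrected)

# Toy session fragment: each minute serves as a data point and as the
# antecedent interval for the next minute.
minutes = [("SD", 2), ("SDELTA", 0), ("SDELTA", 0), ("SD", 0), ("SDELTA", 0)]
cats = [category(p, c) for p, c in zip(minutes, minutes[1:])]
print(cats)
print(discrimination_index(400, 200))  # chance level would be 0.50
```

Equal responding per unit time in SD and SΔ yields an index of 0.50, the chance level against which the mixed-schedule group is later compared.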
The session totals for Group 100-60, the multiple-schedule group trained on the 40-db SD-SΔ difference, show no differences in SD response rate as a function of being preceded by an SD or an SΔ interval. The number of responses in all SΔ categories was markedly lower than in the SD categories. The SΔ rates following reinforced and nonreinforced SD intervals are comparable, and the lowest rates were obtained in SΔ intervals which followed other SΔs. Finally, as training proceeded, the SD rates became higher while rates in all SΔ categories decreased.

The data for Group 100-90 (the 10-db multiple-schedule group) are also presented in Fig. 1. The data are quite similar to those of Group 100-60 except that the SD rates were lower throughout and the SΔ rates were higher during the early part of training. As in the case of Group 100-60, responding in the two SD categories was essentially the same. Also, the levels of responding in SΔ intervals preceded by an SD in which no reinforcements were programmed (SR0) are not unlike those measured in SΔ intervals preceded by a reinforcement SD (SR1 or SR2). Again, the smallest number of responses occurred in SΔ periods which followed other SΔs. Taken together, these findings indicate that there is an antecedent interval effect (SΔ responding is lower when preceded by SΔ), but this sequential effect results from the recent presence of the intensity which has set the occasion for reinforcement, rather than from the recent occurrence of a reinforcement. Under these experimental conditions, the rate in SΔ following an SD was about the same whether the preceding SD included the primary reinforcement of a pellet and the presence of the SD intensity, or merely the presence of the SD stimulus intensity alone. An analysis of variance on the data depicted in Fig. 1 for Groups 100-60 and 100-90 statistically confirmed the observations just noted.
A significant difference can be demonstrated (F = 10.53, df = 1/6, p < .025) in SΔ rate preceded by an SD interval as opposed to an antecedent SΔ interval. There is no difference between the SΔ intervals which follow the various types of SDs (F = 0.59). No comparable antecedent interval effect appears for SD responding. That is, the rate controlled by an SD following an SΔ interval is the same as an SD following another SD. The discriminative control of a current SD intensity is apparently enough to override any sequential effects from the preceding interval.

The response data for the mixed-schedule animals (Group 100-100) have been partitioned in the same way as for the two multiple-schedule groups (Fig. 1). However, for Group 100-100 the intervals categorized as "SD" and "SΔ" did not differ in terms of the imposed stimulus intensity. Only the fact that reinforcements were programmed during some "SD" periods distinguished these intervals from periods of "SΔ". Such a differential stimulus condition might serve a cue function. However, the histograms for Group 100-100 show little in the way of differences among the interval categories, except for an overall reduction in response rate after the first session. That is, without differential stimuli from the imposed intensity continuum, the subjects did not learn to discriminate SD from SΔ. The presence or absence of prior reinforcement produced no differential responding.

Fig. 1. Responses per session as a function of present and prior interval contingencies: mean data. Each bar on each histogram represents the mean response total for the four animals in a group for a given 9-hr training session. Responding in each 1-min interval was subdivided in terms of the intensity and reinforcement conditions prevailing in the immediately preceding 1-min interval. Each 1-min interval served as an antecedent interval and a data point as well.

The data for a representative animal from each of the three groups are presented in Fig. 2. Aside from individual differences in rate, no effects are noted here which were not evident in the mean data.

Fig. 2. Responses per session as a function of present and prior interval contingencies: individual data. Each bar on each histogram represents the response total for an individual animal from one of the three groups for a given 9-hr training session. Responding in each 1-min interval was subdivided in terms of the intensity and reinforcement conditions prevailing in the immediately preceding 1-min interval.

In order to assess more clearly the development of discriminated responding over the course of training, a Discrimination Index was calculated for each session. The index yields the per cent of responding in SD (Discrimination Index = responses in SD / (responses in SD + responses in SΔ)). The SΔ totals were first multiplied by 0.3636 to correct for the disproportionate amount of the session devoted to SΔ. Figure 3 shows the Discrimination Index curves for each individual animal in each of the three groups. Both multiple-schedule groups show increasing Discrimination Indices with continued exposure to the discriminanda. The performance of the 40-db group (100-60) was consistently superior to that of the 10-db group (100-90), as acquisition for these animals proceeded more rapidly. This difference between the "easy" and "difficult" discriminations was not unexpected; rate of acquisition is a direct function of the SD-SΔ intensity difference (Pierrel and Sherman, 1962). The mixed-schedule group showed little evidence of discrimination. Their Discrimination Indices show no systematic change and approximate chance (0.50) levels of performance throughout the 45 hr of training. A trend analysis (Lewis, 1960, pp. 379 ff.) confirmed this result, as no significant training effect was revealed (F = 1.39, df = 4/12, p > .20). The performance of Group 100-100 was obviously inferior to the performance of either multiple-schedule group.

Fig. 3. Percentage of responding in SD (Discrimination Index) as a function of hours of training. Each point on each function represents the Discrimination Index for a 9-hr session for an individual animal.

The present results agree with those of Jenkins (1965) and Boneau et al. (1965) in demonstrating some sequential effects. Response probabilities in SΔ are higher following SD intervals containing reinforcement than in SΔ intervals following other SΔs. However, these local effects upon response rate cannot be attributed solely to the presence or absence of reinforcement in the immediately preceding interval, since SDs without programmed reinforcement produce equally high response probabilities in the following SΔs. The present study demonstrated that the positive discriminative stimulus controls a high rate in its presence and that some of this control carries over into a succeeding SΔ interval. This effect appeared within the first 9 hr of training and was evident throughout all the discrimination training which followed. Local variations in response rate can be attributed to properties of the stimulus sequence, but it is quite clear that the control exerted by the differential stimulus intensities dwarfs these local effects.

REFERENCES

Boneau, C. A., Holland, N. K., and Baker, W. M. Color discrimination performance of pigeons. Science, 1965, 149, 1113-1114.

Dinsmoor, J. A. A quantitative comparison of the discriminative and reinforcing functions of a stimulus. J. exp. Psychol., 1950, 40, 458-472.

Ferster, C. B. and Skinner, B. F. Schedules of reinforcement. New York: Appleton-Century-Crofts, 1957.

Guttman, N. and Kalish, H. I. Discriminability and stimulus generalization. J. exp. Psychol., 1956, 51, 79-88.

Jenkins, H. M. Measurement of stimulus control during discriminative operant conditioning. Psychol. Bull., 1965, 64, 365-376.

Lewis, D. Quantitative methods in psychology. New York: McGraw-Hill, 1960.

Pavlov, I. P. Conditioned reflexes. (Trans. G. Anrep) London: Oxford University Press, 1927.

Pierrel, R. and Sherman, J. G. Generalization of auditory intensity following discrimination training. J. exp. Anal. Behav., 1960, 3, 313-322.

Pierrel, R. and Sherman, J. G. Generalization and discrimination as a function of the SD-SΔ difference. J. exp. Anal. Behav., 1962, 5, 67-71.

Received 24 October 1966.