Transcript Slide 1
MEN ACT UPON THE WORLD,
AND CHANGE IT, AND ARE
CHANGED IN TURN BY THE
CONSEQUENCES OF THEIR
ACTION.
B. F. Skinner, Verbal Behavior, 1957, p. 1
SHAPING CONTINGENCIES
P (SR | Ro) versus P (SR | ~Ro)
SR: reinforcing consequence
Ro: selected operant class
Any current Ro changes toward a target operant class
(Ro target) with differential consequences—i.e., selection.
When the environment is held constant, we have a two-term contingency. The process is a dynamic interplay between selected aspects of current behavior and the reinforcing consequence, yielding novel behavior.
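A minimal sketch (Python) of how these two conditional probabilities might be estimated from a record of observation bins; the event record, function name, and variable names are invented for illustration and are not from the slides.

```python
# Estimate the shaping contingency P(SR | Ro) vs. P(SR | ~Ro) from a
# hypothetical record of discrete observation bins.  Each bin notes whether
# the selected operant (Ro) occurred and whether a reinforcer (SR) followed.
# All events below are invented.

bins = [
    {"Ro": True,  "SR": True},
    {"Ro": True,  "SR": False},
    {"Ro": False, "SR": False},
    {"Ro": True,  "SR": True},
    {"Ro": False, "SR": True},   # reinforcer without the operant
    {"Ro": False, "SR": False},
]

def conditional_p(record, ro_value):
    """P(SR | Ro) when ro_value is True; P(SR | ~Ro) when it is False."""
    relevant = [b for b in record if b["Ro"] == ro_value]
    return sum(b["SR"] for b in relevant) / len(relevant) if relevant else float("nan")

p_given_ro = conditional_p(bins, True)
p_given_not_ro = conditional_p(bins, False)

# A positive difference indicates that SR is contingent on Ro (selection).
print(p_given_ro, p_given_not_ro, p_given_ro - p_given_not_ro)
```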
Operant Conditioning
Three-term Contingency
SD → Ro → SR
SD: discriminative stimulus
Ro: operant class
SR: reinforcer
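A minimal sketch, assuming an all-or-none arrangement, of the three-term contingency as a rule: a response in the operant class Ro produces SR only when SD is present. The function and argument names are hypothetical placeholders.

```python
# Three-term contingency: SD sets the occasion on which a response in the
# operant class Ro is followed by the reinforcer SR.  Names are illustrative.

def reinforcer_delivered(sd_present: bool, response_in_ro: bool) -> bool:
    """Return True if SR is delivered on this occasion."""
    return sd_present and response_in_ro

# A response in the presence of SD is reinforced...
assert reinforcer_delivered(sd_present=True, response_in_ro=True)
# ...but the same response in the absence of SD is not.
assert not reinforcer_delivered(sd_present=False, response_in_ro=True)
```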
Newton’s Second Law of Motion
F = m(dv/dt)
F: force
m: mass
dv/dt: acceleration
PATTERNS OF BEHAVIOR
• “The outstanding characteristic of
operant behavior is that it can be
differentiated in form and in temporal
patterning by consequent events.”
• Morse, 1966
PATTERNS OF BEHAVIOR
• Behavior is very, very complex, but:
• Behavior is not chaotic or without causes.
• This means there can be a science of
behavior.
• There are orderly patterns to behavior.
• These patterns are induced, in part, by
contingencies.
CUMULATIVE STUDY TIME FOR TEST
[Figure: cumulative study time across days of the week, T, W, Th, F, S, Su, M, T]
[Figure: Fixed-Interval (FI) 1-min performance]
SIGNIFICANCE OF SCHEDULES
A. CONCEPT ESSENTIAL TO BEHAVIOR ANALYSIS
B. DEFINITION: Arrangements for initiating and terminating stimuli in time
and in relation to responses.
C. Most consistent, reliable and powerful method for the generation and
analysis of patterns of behavior. “Galilean” flavor of theory and investigation.
D. The operation of schedules has enormous generality across species, response classes, and maintaining events.
E. The richest source of understanding the concepts of response units,
stimulus control, reinforcement, and punishment, as well as for the generation
of quantitative models of behavior.
F. Concepts of motivation are directly tied to, and certainly enriched by, schedule effects.
G. Schedule performances serve as dependent variables for the analysis of
independent variables such as drugs, toxicants, and other environmental actions—
not merely as baselines, but as factors manifesting drug effects themselves.
SCHEDULE CLASSIFICATION
• I. Response independent (FT t and VT t)
• II. Response dependent
a) responses only (e.g., FR n and VR n)
b) responses and time (e.g., FI t and VI t)
c) differentiation (e.g., shaping, IRT > t,
DRO t).
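As a rough illustration of this classification, a toy discrete-time simulation (Python) of one response-independent and two response-dependent arrangements; the response probability, session length, and schedule values are invented, and the simulation is a sketch rather than a standard procedure.

```python
import random

# Toy discrete-time simulation contrasting schedule classes.  Each second an
# organism emits a response with some probability; the schedule decides
# whether a reinforcer is delivered.  All parameters are invented.

def simulate(schedule, p_response=0.3, seconds=600, seed=0):
    rng = random.Random(seed)
    responses = reinforcers = 0
    counter = 0        # responses since the last reinforcer (ratio schedules)
    last_sr_time = 0   # time of the last reinforcer (time-based schedules)
    for t in range(seconds):
        responded = rng.random() < p_response
        responses += responded
        counter += responded
        delivered = False
        if schedule == "FT 60":     # response independent: every 60 s
            delivered = (t - last_sr_time) >= 60
        elif schedule == "FR 10":   # responses only: every 10th response
            delivered = responded and counter >= 10
        elif schedule == "FI 60":   # responses and time: first response after 60 s
            delivered = responded and (t - last_sr_time) >= 60
        if delivered:
            reinforcers += 1
            counter = 0
            last_sr_time = t
    return responses, reinforcers

for sched in ("FT 60", "FR 10", "FI 60"):
    print(sched, simulate(sched))
```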
CONTROLLING VARIABLES
• I. Direct or Intrinsic (e.g., n in ratio
schedules, t in interval schedules).
• II. Indirect (time in ratio schedules, response number in interval schedules, IRTs in both).
• III. Stereotypic versus Dynamic Effects.
[Figure: Fixed-Interval (FI) 1-min performance]
O-rules, functional relations: B = f(r)
E-rules, feedback functions: r = g(B)
B: output
r: feedback
Figure 1. The behavior-environment feedback system
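A minimal sketch of the loop in Figure 1, assuming invented forms for the O-rule B = f(r) (a hyperbolic response-rate function) and the E-rule r = g(B) (a simple VI-like feedback function); iterating the loop shows the coupled system settling on joint response and reinforcement rates.

```python
# Behavior-environment feedback system (Figure 1), with invented functions.
#   O-rule (organism):    B = f(r)  -- response rate as a function of obtained
#                                      reinforcement rate
#   E-rule (environment): r = g(B)  -- obtained reinforcement rate as a
#                                      function of response rate

def f(r, k=100.0, r0=0.5):
    """Invented O-rule: hyperbolic, with ceiling k responses/min."""
    return k * r / (r0 + r)

def g(B, t=1.0):
    """Invented E-rule: simple VI-like form, ceiling 1/t reinforcers/min."""
    return B / (t * B + 1.0)

# Iterate the loop B -> r -> B until the coupled system settles.
B = 10.0  # arbitrary starting response rate (responses/min)
for _ in range(100):
    r = g(B)
    B = f(r)
print(round(B, 2), round(r, 2))   # joint equilibrium rates
```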
MOLAR FEEDBACK FUNCTIONS:
WHAT IS THE RELATION BETWEEN
RESPONSE RATE UNDER SOME
CONTINGENCY AND REINFORCEMENT
FREQUENCY?
“MOLAR” MEANS SOMETHING LIKE “IN THE LONG RUN,” BUT WITHOUT BEING VERY SPECIFIC ABOUT WHAT “LONG RUN” MEANS.
WHAT IS THE RELATIONSHIP
BETWEEN O-RULES AND E-RULES?
CAN WE PREDICT O-RULES FROM
E-RULES?
IN GENERAL, THE RELATIONS ARE
NOT SIMPLE.
Ratio-Like Feedback Functions
r = m g(B); m = 1/n
r = kB^p
r = aB^2 + bB + c
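A short sketch evaluating these three ratio-like forms at a few response rates; all parameter values are invented, and for the strict-ratio case g(B) is taken to be B itself, so r = mB = B/n.

```python
# Ratio-like feedback functions: obtained reinforcement rate r keeps growing
# as response rate B grows.  All parameter values are invented.

def fr_feedback(B, n=10):
    """Strict ratio (FR n / RR n): r = B / n, i.e., r = mB with m = 1/n."""
    return B / n

def power_feedback(B, k=0.05, p=1.2):
    """Power form: r = k * B**p."""
    return k * B ** p

def quadratic_feedback(B, a=0.001, b=0.05, c=0.0):
    """Quadratic form: r = a*B**2 + b*B + c."""
    return a * B ** 2 + b * B + c

for B in (10, 30, 60, 120):   # responses per minute
    print(B, fr_feedback(B),
          round(power_feedback(B), 2),
          round(quadratic_feedback(B), 2))
```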
Baum’s (1992) Proposed VI t Feedback Function
r = B[a + (1 - a)exp(-cB)] / (tB[a + (1 - a)exp(-cB)] + 1)
OTHER POSSIBILITIES
r = A sin^2(kB)
r = g(dB/dt)
Tav = K(IRTav)(1 - IRTav)
SPECIFIC ANALYTIC FEEDBACK FUNCTIONS FOR MOST OF THE “COMMON” SCHEDULES HAVE NOT BEEN SPECIFIED. IN FACT, THIS TASK CAN BE VERY CHALLENGING. IN A FEW CASES WE CAN GUESS WHAT THE GENERAL FORM OF THE FUNCTION MIGHT BE.
EXAMPLE: THE VARIABLE-INTERVAL SCHEDULE.
Feedback Functions
Fixed or Random Ratio (FR n or RR n)
r = mB    (r = reinforcer rate, B = response rate, m = reinforcers/response = 1/n)
Variable or Random Interval (VI t or RI t)
r = B[a + (1 - a)exp(-cB)] / (tB[a + (1 - a)exp(-cB)] + 1)
(Baum, 1992)
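A sketch comparing the two feedback functions above at a few response rates, with invented values for n, t, a, and c; it shows the ratio function growing without bound while Baum's VI function approaches its ceiling of 1/t.

```python
import math

# Feedback functions from this slide, with invented parameter values.
#   FR n / RR n:        r = mB, m = 1/n
#   VI t / RI t (Baum): r = B[a + (1-a)exp(-cB)] / (tB[a + (1-a)exp(-cB)] + 1)

def ratio_feedback(B, n=20):
    return B / n

def vi_feedback(B, t=1.0, a=0.5, c=0.1):
    w = a + (1.0 - a) * math.exp(-c * B)   # bracketed term in Baum's equation
    return B * w / (t * B * w + 1.0)

# The ratio function grows without bound; the VI function approaches 1/t.
for B in (5, 20, 60, 200):    # responses per unit time
    print(B, ratio_feedback(B), round(vi_feedback(B), 3))
```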
PROBLEM: HOW DOES BEHAVIOR
COME UNDER CONTROL OF A
MOLAR FEEDBACK FUNCTION?
THIS QUESTION RAISES THE
ISSUE OF MOLAR vs. MOLECULAR
ANALYSIS. MOLAR CONTROL MUST
EMERGE FROM SHAPING AT THE
MOLECULAR LEVEL. HOW?
HOW DO VARIABLE-RATIOS (VR n) AND VARIABLE-INTERVALS (VI t) COMPARE?
“Variable-ratios control a higher rate than
variable-intervals.”
WHY IS THIS STATEMENT, AT BEST,
IMPRECISE AND, AT WORST, OUTRIGHT
WRONG?
How can we make the statement precise?
What is an essential control condition?
ONE POSSIBLE WAY IS THROUGH
DIFFERENTIAL REINFORCEMENT
OF INTER-RESPONSE TIMES (IRTs).
FOR EXAMPLE, RATIO SCHEDULES
FAVOR SHORT IRTs. WHY? INTERVAL
SCHEDULES FAVOR LONGER IRTs.
WHY?
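One way to make the "why" concrete: under the simplifying assumption that reinforcers on a random-interval schedule set up at a constant rate 1/t, the probability that a response is reinforced increases with the IRT that precedes it, while on a random-ratio schedule it stays at 1/n regardless of the IRT. A minimal sketch with invented parameter values:

```python
import math

# Why do interval schedules favor long IRTs while ratio schedules do not?
# Assumption: on a random-interval schedule with mean interval t, reinforcers
# "set up" at a constant rate 1/t, so the probability that a response is
# reinforced grows with the IRT preceding it.  On a random-ratio schedule each
# response is reinforced with probability 1/n, regardless of the IRT.

def p_reinforced_interval(irt, t=30.0):
    """P(reinforcer | response) after waiting `irt` seconds on RI t."""
    return 1.0 - math.exp(-irt / t)

def p_reinforced_ratio(irt, n=20):
    """P(reinforcer | response) on RR n -- independent of the IRT."""
    return 1.0 / n

for irt in (1, 5, 15, 60):   # inter-response time in seconds
    print(irt,
          round(p_reinforced_interval(irt), 3),
          round(p_reinforced_ratio(irt), 3))
```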
THE TWO CONTINGENCIES MUST PROVIDE
EQUAL REINFORCEMENT
FREQUENCY!
THUS THE PROPER STATEMENT IS: “GIVEN
EQUAL OBTAINED REINFORCEMENT
FREQUENCY, VARIABLE-RATIO SCHEDULES
WILL CONTROL A HIGHER RATE OF
RESPONDING THAN VARIABLE-INTERVAL
SCHEDULES.”
BUT HOW DO WE ARRANGE THIS?
QUESTION
IF INSTEAD OF COMPARING THE TWO
SCHEDULES ON THE BASIS OF EQUAL
REINFORCEMENT FREQUENCY, WE WANTED
TO COMPARE ON THE BASIS OF EQUAL
RESPONSE REQUIREMENTS (i.e., the number
of responses/reinforcer), HOW WOULD WE DO
THAT?