EXPLORING PSYCHOLOGY (7th Edition in Modules) David Myers
Operant Conditioning
Module 19
Learning

Operant Conditioning Overview
• Skinner's Experiments
• Extending Skinner's Understanding
• Skinner's Legacy
• Contrasting Classical & Operant Conditioning
Edward L. Thorndike (1874–1949)

Thorndike's Puzzle Box link
Early Operant Conditioning
• E. L. Thorndike (1898)
• Puzzle boxes and cats

[Diagram: the cat's responses to the stimuli inside the puzzle box. First trial in box: scratch at bars, push at ceiling, dig at floor, howl, etc., and only eventually press the lever. After many trials in box: pressing the lever comes quickly, and the other responses have largely dropped out.]
Mnemonic: BLOWS
• Behaviorism
• Learned
• Observable
• Watson
• Skinner

B. F. Skinner (1904–1990)
B. F. Skinner and Operant Conditioning
• Classical conditioning involves an automatic response to a stimulus.
• Operant conditioning involves learning to perform a behavior (pressing a lever, for example) in order to gain a reward or avoid a punishment.
General Skinnerian Idea
• “Behavior is a function of its
consequences.”
Skinner's Experiments
Skinner's experiments extend Thorndike's thinking, especially his law of effect. This law states that rewarded behavior is likely to occur again.
Operant Conditioning
Operant Behavior:
• operates (acts) on the environment
• produces consequences
Respondent Behavior:
• occurs as an automatic response to a stimulus
• behavior learned through classical conditioning
Operant Chamber
Skinner Box: a chamber with a bar or key that an animal manipulates to obtain a food or water reinforcer; contains devices to record responses.
Operant Chamber: Examples
The "Skinner Box"
• Rats placed in "Skinner boxes"
• Shaped to get closer and closer to the bar in order to receive food
• Eventually required to press the bar to receive food
• Food is a reinforcer
Shaping
Shaping is the operant conditioning procedure in which reinforcers guide behavior toward the desired target behavior through successive approximations. link

A rat shaped to sniff mines. A manatee shaped to discriminate objects of different shapes, colors, and sizes.
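To make "successive approximations" concrete, here is a minimal, hypothetical Python sketch (not from the text; the stages, trial counts, and the distance_to_bar variable are illustrative assumptions): the criterion for reinforcement is tightened stage by stage until only behavior very close to the target (a full bar press) earns food.

import random

# Hypothetical sketch of shaping by successive approximations.
# distance_to_bar stands in for how far the animal's behavior is from a
# full bar press (0.0 = pressing the bar); the criterion shrinks each stage.

def shaping_demo(criteria=(0.8, 0.5, 0.2, 0.05), trials_per_stage=20):
    pellets = 0
    for criterion in criteria:                    # each stage demands a closer approximation
        for _ in range(trials_per_stage):
            distance_to_bar = random.random()     # the animal's behavior on this trial
            if distance_to_bar <= criterion:      # close enough for the current stage?
                pellets += 1                      # deliver the reinforcer
    return pellets

print(shaping_demo())                             # total reinforcers delivered across all stages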
[Figures from text p. 228]
Types of Reinforcers
Reinforcement: any event that strengthens the behavior it follows.

A heat lamp positively reinforces a meerkat's behavior in the cold.
Types of Reinforcement
• Positive reinforcer (+): adds something rewarding following a behavior, making that behavior more likely to occur again. Giving a dog a treat for fetching a ball is an example.
• Negative reinforcer (–): removes something unpleasant that was already in the environment following a behavior, making that behavior more likely to occur again. Taking an aspirin to relieve a headache is an example.
Learned Helplessness
• Failure to try to avoid an unpleasant stimulus because in the past it was unavoidable
• Possible model for depression in humans
Punishment
An aversive event that decreases the behavior it follows.
Kinds of Reinforcement and Punishment
"Positive" (+) means a stimulus is added; "negative" (–) means a stimulus is removed. "Reinforcement" and "punishment" are labels applied afterwards to describe whether the behavior increased or decreased.

• Positive Reinforcement: adding a pleasant consequence → behavior increases
• Negative Reinforcement: removing an aversive stimulus → behavior increases
• Positive Punishment: adding an aversive stimulus → behavior decreases
• Negative Punishment: removing a pleasant stimulus → behavior decreases
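As a small illustration of this 2 × 2 scheme (not from the text; the function name and argument strings are hypothetical), here is a Python sketch that labels a consequence from two observations: whether a stimulus was added or removed, and whether the behavior afterwards increased or decreased.

# Hypothetical sketch: classify a consequence using the 2 x 2 scheme above.
# stimulus_change: "added" or "removed"; behavior_change: "increased" or "decreased".

def classify_consequence(stimulus_change, behavior_change):
    kind = "reinforcement" if behavior_change == "increased" else "punishment"
    sign = "positive" if stimulus_change == "added" else "negative"
    return f"{sign} {kind}"

print(classify_consequence("added", "increased"))    # positive reinforcement (treat for fetching)
print(classify_consequence("removed", "increased"))  # negative reinforcement (aspirin removes headache)
print(classify_consequence("added", "decreased"))    # positive punishment
print(classify_consequence("removed", "decreased"))  # negative punishment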
Negative Reinforcement and Punishment
• Negative reinforcement: (1) an unpleasant stimulus is present; (2) the behavior removes that unpleasant stimulus.
• Punishment: the behavior is followed by (1) introducing an unpleasant stimulus, or (2) withholding a pleasant stimulus.
[Figure 6.18: Positive reinforcement versus negative reinforcement]
[Figure 6.20: Comparison of negative reinforcement and punishment]
IMPORTANT!!
• Negative reinforcement is NOT punishment.
• Negative reinforcement is the REMOVAL of an unpleasant stimulus when the target behavior is observed (a positive consequence of behavior; it increases the behavior).
• Punishment is the introduction of an aversive (unpleasant) stimulus, or the removal of a pleasant stimulus, as a consequence of behavior (a negative consequence of behavior; it decreases the behavior).
Primary & Secondary Reinforcers
1. Primary Reinforcer: an innately reinforcing stimulus, like food or drink (satisfies a biological need).
2. Conditioned (secondary) Reinforcer: a learned reinforcer that gets its reinforcing power through association with a primary reinforcer.
Immediate & Delayed Reinforcers
1. Immediate Reinforcer: a reinforcer that occurs instantly after a behavior, such as a rat getting a food pellet for a bar press.
2. Delayed Reinforcer: a reinforcer that comes some time after the behavior, such as a paycheck that arrives at the end of the week.
Reinforcement Schedules
1. Continuous Reinforcement: reinforces the desired response each time it occurs.
2. Partial (intermittent) Reinforcement: reinforces a response only part of the time. Though this results in slower acquisition in the beginning, it shows greater resistance to extinction later on.
Schedules of Reinforcement
• Partial reinforcement lies between continuous reinforcement and extinction.
Schedules of Reinforcement
Fixed Ratio (FR)
• reinforces a response only after a specified number of responses
• the faster you respond, the more rewards you get
• different ratios are possible
• produces a very high rate of responding
• like piecework pay
Schedules of Reinforcement
Variable Ratio (VR)
• reinforces a response after an unpredictable number of responses
• like gambling or fishing
• very hard to extinguish because of unpredictability
• Skinner link 3:58
• SLOT machines show SLOwesT extinction.
Schedules of Reinforcement
Fixed Interval (FI)
• reinforces a response only after a specified (fixed) time has elapsed
• responding occurs more frequently as the anticipated time for reward draws near
Schedules of Reinforcement
Variable Interval (VI)
• reinforces a response at unpredictable time intervals
• produces slow, steady responding
• like a pop quiz
Intermittent Reinforcement Schedules Summary

                                        Predictable          Unpredictable ("on the average")
Based on number of necessary responses  Fixed Ratio (FR)     Variable Ratio (VR)
Based on time that must first pass      Fixed Interval (FI)  Variable Interval (VI)
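To make the four schedules concrete, here is a minimal, hypothetical Python sketch (not from the text; the function names, the ratio of 10, and the 60-second intervals are illustrative assumptions). Each function decides whether the current response earns a reinforcer, based either on how many responses have occurred or on how much time has passed.

import random

# Hypothetical sketch of the four intermittent schedules of reinforcement.
# Each function returns True when the current response should be reinforced.

def fixed_ratio(response_count, n=10):
    return response_count % n == 0                # FR: every n-th response is reinforced

def variable_ratio(n=10):
    return random.random() < 1.0 / n              # VR: reinforced after about n responses on average

def fixed_interval(seconds_since_last_reward, interval=60):
    return seconds_since_last_reward >= interval  # FI: first response after the fixed interval pays off

def variable_interval(seconds_since_last_reward, mean_interval=60):
    required_wait = random.uniform(0, 2 * mean_interval)   # VI: the required wait is unpredictable
    return seconds_since_last_reward >= required_wait

print(fixed_ratio(30))       # True: the 30th response on an FR-10 schedule
print(fixed_interval(75))    # True: more than 60 seconds have elapsed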
Schedules of Reinforcement
• You do not have to write down the following examples.
FI, VI, FR, or VR?
1. When I bake cookies, I can only put one set in at a time, so after 10 minutes my first set of cookies is done. After another ten minutes, my second set of cookies is done. I get to eat a cookie after each set is done baking.
2. After every 10 math problems that I complete, I allow myself a 5-minute break.
3. I look over my notes every night because I never know how much time will go by before my next pop quiz.
4. When hunting season comes around, sometimes I'll spend all day sitting in the woods waiting to get a shot at a big buck. It's worth it though when I get a nice 10-point.
5. Today in Psychology class we were talking about Schedules of Reinforcement and everyone was eagerly raising their hands and participating. Miranda raised her hand a couple of times and was eventually called on.

Answers: 1. FI   2. FR   3. VI   4. VI   5. VR
FI, VI, FR, or VR?
6. Madison spanks her son if she has to ask him three times to clean up his room.
7. Emily has a spelling test every Friday. She usually does well and gets a star sticker.
8. Steve's a big gambling man. He plays the slot machines all day hoping for a big win.
9. Snakes get hungry at certain times of the day. They might watch any number of prey go by before they decide to strike.
10. Mr. Bertani receives a salary paycheck every 2 weeks. (Miss Suter doesn't.)
11. Christina works at a tanning salon. For every 2 bottles of lotion she sells, she gets 1 dollar in commission.
12. Mike is trying to study for his upcoming Psychology quiz. He reads five pages, then takes a break. He resumes reading and takes another break after he has completed 5 more pages.

Answers: 6. FR   7. FI   8. VR   9. VI   10. FI   11. FR   12. FR
FI, VI, FR, or VR?
13. Megan is fundraising to try to raise money so she can go on the annual band trip. She goes door to door in her neighborhood trying to sell popcorn tins. She eventually sells some.
14. Kylie is a business girl who works in the big city. Her boss is busy, so he only checks her work periodically.
15. Mark is a lawyer who owns his own practice. His customers make payments at irregular times.
16. Jessica is a dental assistant and gets a raise every year at the same time and never in between.
17. Andrew works at a GM factory and is in charge of attaching 3 parts. After he gets his parts attached, he gets some free time before the next car moves down the line.
18. Brittany is a telemarketer trying to sell life insurance. After so many calls, someone will eventually buy.

Answers: 13. VR   14. VI   15. VI   16. FI   17. FR   18. VR
Updating Skinner's Understanding
• Skinner's emphasis on the external control of behavior made him an influential but controversial figure.
• Many psychologists criticized Skinner for underestimating the importance of cognitive and biological constraints.
Cognitive Approach
Emphasizes abstract and subtle learning that could not be achieved through conditioning or social learning alone.
Cognition & Operant Conditioning
Evidence of cognitive processes during operant learning comes from rats exploring a maze: they learn to navigate it even without an obvious reward. Rats seem to develop cognitive maps (E. C. Tolman), or mental representations, of the maze's layout (their environment).
Latent Learning
Intrinsic & Extrinsic Motivation
• Intrinsic Motivation: the desire to perform a behavior for its own sake.
• Extrinsic Motivation: the desire to perform a behavior due to promised rewards or threats of punishment.
Biological Predisposition
Biological constraints predispose organisms to learn associations that are naturally adaptive.

[Photo: Marian Breland Bailey]
Skinner's Legacy
Skinner argued that behaviors were shaped by external influences instead of inner thoughts and feelings. Critics argued that Skinner dehumanized people by neglecting their free will.
EXPLORING PSYCHOLOGY (7th Edition in Modules)
David Myers

PowerPoint Slides by Aneeq Ahmad, Henderson State University; Amy Jones, Bernstein, Schallhorn, with Garber edits
Worth Publishers, © 2008