Operant Conditioning
 Operant Conditioning – A form of learning in which voluntary responses come to be controlled by their consequences.
 What does this mean?
 Pioneered by B.F. Skinner
How did Skinner Study Operant Conditioning?
 Skinner Box – A small enclosure in which an animal can make a specific response that is systematically recorded while the consequences of the response are controlled
 How does the device work?
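As a rough illustration of how the device works (this sketch is not from the slides; the function name and loop structure are hypothetical), a Skinner box can be modeled as a loop that records each response and delivers whatever consequence the experimenter's rule specifies:

```python
# Hypothetical sketch of a Skinner box session: each response is
# systematically recorded together with its controlled consequence.
def run_skinner_box(presses, consequence_rule):
    """Record (response number, consequence) pairs for a session."""
    log = []
    for press_number in range(1, presses + 1):
        # The experimenter controls what follows each response,
        # e.g. "food", "shock", or None (nothing).
        outcome = consequence_rule(press_number)
        log.append((press_number, outcome))
    return log

# Example rule: every lever press is followed by food.
session = run_skinner_box(3, lambda press: "food")
```

Swapping in a different `consequence_rule` would model reinforcement, punishment, or the reinforcement schedules covered later in these slides.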
Operant Conditioning Principles
 Reinforcement – Occurs when an event following a response increases an organism's tendency to make that response
 i.e., the response is strengthened because of what follows it
 Examples
 Punishment – Occurs when an event following a response weakens an organism's tendency to make that response
 i.e., the response is weakened because of what follows it
 Examples
Types of Reinforcement/Punishment
 1.) Positive Reinforcement
 2.) Negative Reinforcement
 3.) Positive Punishment
 4.) Negative Punishment
How to remember:
 Reinforcement – behavior increases
 Punishment – behavior decreases
 Positive – something is added
 Negative – something is removed
1.) Positive Reinforcement
 A response is strengthened because it is followed by a rewarding stimulus
 In the Skinner box, rats press the lever more if they are rewarded with food
 Real World Examples:
2.) Negative Reinforcement
 A response is strengthened because it is followed by the removal of an unpleasant stimulus
 In the Skinner box, a rat presses the lever more to escape or avoid a shock
 Real World Examples:
3.) Positive Punishment
 A response is weakened because it is followed by an aversive (unpleasant) stimulus
 In the Skinner box, rats stop pressing the lever if presses are followed by a shock
 Real World Examples:
4.) Negative Punishment
 A response is weakened because it is followed by the removal of a pleasant stimulus
 Equate this with “Time out”
 Children stop acting out because their toys are removed
 Other Real World Examples:
How can Operant Procedures be used?
 Shaping – The reinforcement of closer and closer approximations of a desired response
 Family Guy clip:
http://www.youtube.com/watch?v=prvSM8TlIeI
 Other examples:
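Shaping is algorithmic enough to sketch in code. The model below is illustrative only (the function names and the idea of a numeric "distance from the desired response" are my own, not from the slides): responses within a tolerance band are reinforced, and the band tightens after each success so only closer and closer approximations earn reinforcement.

```python
def shape(responses, target, tolerance, shrink=0.5):
    """Reinforce responses within `tolerance` of the target, tightening
    the tolerance after each success so that only closer and closer
    approximations of the desired response earn reinforcement."""
    reinforced = []
    for response in responses:
        if abs(response - target) <= tolerance:
            reinforced.append(response)
            tolerance *= shrink  # demand a closer approximation next time
    return reinforced

# Responses drifting toward a target of 10; after the first success the
# band shrinks to 2.5, so the response at 6 is no longer reinforced.
shaped = shape([5, 6, 9, 9.9], target=10, tolerance=5)
```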
Extinction
 Extinction – The gradual weakening and disappearance of a response when it is no longer followed by reinforcement
 In the Skinner box, rats stop pressing the lever when food is no longer delivered
 Real World Examples:
Schedules of Reinforcement
- The pattern in which reinforcers are presented over time
 Continuous Reinforcement – Every instance of a response is reinforced
 Ex.) every time the rat presses the lever, it receives food
 Intermittent Reinforcement – A response is reinforced only some of the time
 4 Types:
 1. Fixed Ratio
 2. Variable Ratio
 3. Fixed Interval
 4. Variable Interval
4 Types of Intermittent Reinforcement
 Terminology Breakdown
 Fixed – Reinforcement occurs after a set number (of responses or units of time)
 Variable – Reinforcement occurs after a varied number (of responses or units of time)
 Ratio – Based on responses (number of lever presses, for example)
 Interval – Based on time (number of minutes passed, for example)
1.) Fixed Ratio
 A reinforcer is given after a set (or fixed) number of responses
 Examples:
 Rat receives food every 10th lever press
2.) Variable Ratio
 A reinforcer is given after a varied number of responses
 Examples:
 Rat gets food, on average, every 10th lever press
3.) Fixed Interval
 A reinforcer is given after a set (or fixed) time interval
 Examples:
 Rat is given food for the first lever press after each 2-minute interval
4.) Variable Interval
 A reinforcer is given after a varied amount of time passes
 Examples:
 Rat is given food for the first lever press after intervals averaging 2 minutes
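The four schedules above boil down to simple decision rules for whether a given response is reinforced. The sketch below is illustrative only (the function names, parameters, and use of `random` are my own assumptions, not from the slides):

```python
import random

def fixed_ratio(response_count, ratio=10):
    # Reinforce every `ratio`-th response (e.g. every 10th lever press).
    return response_count % ratio == 0

def variable_ratio(ratio=10):
    # Reinforce each response with probability 1/ratio, so reinforcement
    # arrives, on average, every `ratio`-th response.
    return random.random() < 1 / ratio

def fixed_interval(seconds_since_reinforcer, interval=120):
    # Reinforce the first response after a fixed interval (e.g. 2 minutes).
    return seconds_since_reinforcer >= interval

def variable_interval(seconds_since_reinforcer, current_interval):
    # Reinforce the first response after an interval that varies around a
    # mean (e.g. averaging 2 minutes); `current_interval` would be redrawn
    # at random after each reinforcer.
    return seconds_since_reinforcer >= current_interval
```

Note how the ratio rules depend only on response counts while the interval rules depend only on elapsed time, which is the distinction the terminology breakdown draws.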
 Ratio Schedules = more rapid responding
 Why?
 Variable Schedules = greater resistance to extinction
 Why?