Make Your Memory Stronger!
MEMORY
SEQUENCE LEARNING:
PRE-EXPERIMENT
CLASSROOM PRESENTATION
Version 0.7 © 2007 CELEST
Make Your Memory Stronger:
“Why Would I Do That?”
Scenario: everyday life
"You're sooo cute! Call me: 617-559-5581"
"Sure! I'll REMEMBER! (haha… a piece of cake!)"
30 seconds later…
"Ok, time to write this down: 617-569-5581 … 617-???-?581 … ???-???-????"
Don’t Let This Happen to You!
Result:
In a few brief exercises, you will explore
how to improve your memory and not
forget the important stuff!
Why Is Memory Important?
Sequence Learning Focuses on
Order and Memory
Our memory allows us to store and recall
information in a specific order, or
sequence.
Why would order in memory be
important?
Types of Memory that Support
Sequence Learning
What is the role of working memory?
What is the role of long-term memory?
(Timeline: 2 years ago ……… 2 minutes ago)
Working Memory
Working memory has a very limited capacity (a span of only a few items).
Working memory has a short duration.
Working memory is extremely fragile.
Through rehearsal, information in working memory can become long-term memory.
By the way….
What was the girl’s phone number again???
A Working Memory Challenge
The next slide will present you with a list of
words. Try to remember them in order.
After the list disappears,
count down from 10 to 1.
car
glass
apple
football
chair
Count backwards from 10!
Memory Check
Write down the words in order!
How did you do? Check it out:
car, glass, apple, football, chair
Working Memory Challenge Results
How many items did you recall?
Was it difficult to remember the words in
the right order?
What strategies did you use to remember
the ordered list?
Let’s Explore…Chunking!
Events are stored and recalled in groups,
or chunks. Chunking by item feature and
item order can be used to increase the
capacity of memory.
A physical representation of chunking:
Is it easier to hold 30 loose
tennis balls in your
arms?…
Or is it easier to hold 30
tennis balls that you have
grouped into two bags?
(Figure: two bags of tennis balls, labeled Chunk #1 and Chunk #2)
When do you use chunking?
How Do Chunking, Working Memory,
and Long-Term Memory
Relate to Each Other?
Working Memory Capacity
George Miller (1956)
noticed that working
memory seems capable of
holding seven items, plus
or minus two (7 +/- 2).
(Graph: number of items remembered vs. number of items in the list)
The capacity, or span, of an individual's working memory is the number of items that a person can retain and recall.
An individual’s memory span can vary!
What factors can affect someone’s span?
Typical Features of Working Memory
Performance
Primacy: the effect of easily remembering the first few items in a sequence
Recency: the effect of easily remembering the last (most recent) few items in a sequence
Middle items are more difficult to remember.
Working Memory Summary
• Working memory is extremely fragile, does not last for a long period of time, and can only hold a limited number of items (span)
• Several factors prevent us from keeping some information in mind:
  - A lot of things to remember
  - Distractions
  - Time
• Some strategies help us keep things in mind:
  - Chunking (learning information from groups of items): very effective for short-term learning
  - Distributed practice (rehearsing information from different chunks): very effective for long-term learning
START THE EXPERIMENT!
Take the Working Memory
Span Challenge!
You will be asked to remember a sequence of
numbers without writing them down (item
and order)
Because working memory has a limited
capacity, everyone will eventually make
mistakes on this challenge
JUST DO YOUR BEST!
Accessing the
Sequence Learning Software
Using the Sequence Learning Software
The Sequence Learning Experiment:
3 Phases
• PHASE 1: Span Determination
Determines how many numbers you can
recall from your working memory.
• PHASE 2: Challenge Recall
Challenges you by adding one more number
to your span!
• PHASE 3: Grouped Recall
Challenges you by adding one more number
to your span and presents the numbers to
you in chunks!
Introduction to Experiment Phases:
Using the Software
Sample Results Summary
END OF PRE-EXPERIMENT
PRESENTATION
MEMORY
SEQUENCE LEARNING:
POST-EXPERIMENT
CLASSROOM PRESENTATION
Results Summary
Ways to Improve Memory!
What do you think are some ways you could have
improved your memory during the experiment?
1) Chunk it.
2) Write it down.
3) Pay attention.
4) Rehearse information in working memory.
5) Decrease unnecessary stress.
6) Challenge yourself to increase your memory span (think about thinking!).
7) Relate new information to prior knowledge and experiences.
8) Define the information in a different way.
9) Talk to other people about the information or teach it to others.
10) Get enough sleep!
END OF POST-EXPERIMENT
PRESENTATION
MEMORY
SEQUENCE LEARNING:
ADVANCED MODELING
CLASSROOM PRESENTATION
What is Working Memory?
An active sketchpad for manipulating items in memory
A simple short-term memory store
Stores items AND order
(Figure: the original three-component model of working memory of Baddeley and Hitch, 1974)
(Figure: a serial recall curve, % correct vs. serial position)
Serial recall curves help us to understand some of the ways in which working memory works.
Observations About
Working Memory
Memory Span: memory has a limited capacity
Primacy: the first items are more likely to be correctly recalled
Recency: the last items are also more likely to be correctly recalled
Temporal Grouping: items presented close to one another in time can form 'chunks'
Primacy, Recency, and Bowing
(Figure: a serial recall curve, % correct vs. serial position, showing both a primacy effect and a recency effect)
When both primacy and recency occur, the serial recall curve exhibits bowing.
Did you see primacy, recency or both (bowing)
when using the Sequence Learning Software?
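For readers who want to see how a curve like this is computed, here is a minimal Python sketch (not part of the original slides): for each serial position it counts a response as correct only when the right item is recalled at that position, then averages across trials. The two trials in the example are made-up data for illustration only.

```python
# Minimal sketch of computing a serial recall curve (% correct per serial position).
# The two example trials below are made-up data for illustration only.

def serial_position_curve(trials):
    """trials: list of (presented, recalled) pairs; each is a list of items."""
    length = len(trials[0][0])
    correct = [0] * length
    for presented, recalled in trials:
        for pos in range(length):
            # Strict positional scoring: the item must be recalled at its original position.
            if pos < len(recalled) and recalled[pos] == presented[pos]:
                correct[pos] += 1
    return [100.0 * c / len(trials) for c in correct]

trials = [
    (list("FBICIANSA"), list("FBIC_ANSA")),  # one middle item missed
    (list("FBICIANSA"), list("FBI___NSA")),  # middle items missed, ends intact
]
print(serial_position_curve(trials))  # dips in the middle: a bowed curve
```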
Temporal Grouping
What are temporal grouping effects?
For grouped lists, the effects on the serial position
curve are clear:
(Figure: serial position curves, % correct vs. serial position, for an ungrouped list with items presented 150 ms apart and for a grouped list in which 750 ms pauses separate groups of items presented 150 ms apart)
Chunking
(Figure: the letters F, B, I, C, I, A, N, S, A scattered in random order)
Now Pay Attention to
Government Agencies….
(Figure: the same letters, now read as the chunks FBI, CIA, NSA)
Chunking
When items are chunked to form groups, they are easier to remember:
3 chunks are easier to remember than 9 items.
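As a small illustration (not from the slides), here is a Python sketch of that regrouping, assuming the nine letters are the ones from the government-agency example above:

```python
# Minimal sketch of chunking: nine individual letters become three familiar chunks.
letters = list("FBICIANSA")              # nine items to hold one by one
chunks = ["".join(letters[i:i + 3])      # regroup into chunks of three
          for i in range(0, len(letters), 3)]

print(len(letters), "items:", letters)
print(len(chunks), "chunks:", chunks)    # ['FBI', 'CIA', 'NSA']
```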
Relevance to Models of
Working Memory
Observations about span, primacy,
recency and chunking should constrain
any biological model of working
memory
These phenomena should be qualitatively
and quantitatively explained
Modeling Working Memory
Working memory models are composed of fields.
Each field contains many nodes, each of which can represent a single neuron or a population of neurons.
Some fields represent inputs.
Other fields actually store the inputs in working memory.
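One way to picture fields and nodes in code (an illustrative sketch, not the CELEST software): each field is simply an array of node activations, one entry per node.

```python
import numpy as np

# Illustrative sketch: a field is an array of activations, one entry per node.
n = 5                    # number of nodes (how many distinct items can be represented)
F0 = np.zeros(n)         # input field: which item is being presented right now
F1 = np.zeros(n)         # working memory field: stores the items that were presented

F0[2] = 1.0              # present the third item to the input field
print("F0 (inputs):        ", F0)
print("F1 (working memory):", F1)
```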
Graphing conventions
(Figure legend: excitation (+), inhibition (−), learned weights, modulators)
Types of connections
Convergent ("in-star")
Divergent ("out-star")
Types of connections
Feedforward
Feedback
Properties of a Neuron
or Neural Population
Tonic oscillations
Input-based phasic excitation
Passive decay of activation
Competitive interaction
(Figure: on-center, off-surround and off-center, on-surround patterns of excitatory (+) and inhibitory (−) connections)
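Two of these properties, input-based phasic excitation and passive decay of activation, can be sketched for a single node as follows (an illustrative toy, not the slides' model; the decay rate, input, and time step are made-up values):

```python
# Toy sketch of one node: excitation while an input is on, passive decay afterwards.
# dx/dt = -decay * x + input   (simple Euler integration; parameter values are made up)
decay, dt = 0.5, 0.01
x, trace = 0.0, []
for step in range(400):
    inp = 1.0 if step < 100 else 0.0      # input on for the first second only
    x += dt * (-decay * x + inp)          # phasic excitation plus passive decay
    trace.append(x)

print("activation when input turns off: %.3f" % trace[99])
print("activation three seconds later:  %.3f" % trace[-1])
```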
The STORE Model: Inputs
This field (the input field, or F0) contains n nodes. Each node becomes active when a specific input is presented.
(Figure: input field F0 with nodes I1, I2, I3, I4, …, In)
Additional Input Representations
(Figure: the input field F0, with nodes I1, I2, I3, I4, …, In, showing an additional input representation)
Adding Working Memory
(Figure: a working memory field F1, with nodes X1, X2, X3, X4, …, Xn, added above the input field F0, with nodes I1, I2, I3, I4, …, In)
Item Working Memory
This network stores the items that have been presented in working memory.
(Figure: input nodes I1, I2, I3, I4, …, In in F0 projecting to working memory nodes X1, X2, X3, X4, …, Xn in F1; some F1 nodes are active)
Which items were presented to this model?
In what order were the items presented?
What about order?
Can order be represented by a gradient
of activations? YES!!
HIGHER activations → EARLIER items
LOWER activations → LATER items
(Figure: F1 nodes X1, X2, X3, X4, …, Xn with a decreasing gradient of activations)
Which items were presented? In what order were the items presented?
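A tiny sketch (not from the slides) of how such a gradient can be read out: sorting the active nodes from highest to lowest activation recovers the presentation order. The activation values below are made up.

```python
import numpy as np

# Made-up F1 activations; higher activation = earlier item, zero = never presented.
x = np.array([0.0, 0.3, 0.9, 0.0, 0.0, 0.6])     # nodes X1..X6

active = np.flatnonzero(x > 0)                   # which items were presented
order = active[np.argsort(-x[active])]           # largest activation first = earliest
print("presentation order:", ["X%d" % (i + 1) for i in order])   # ['X3', 'X6', 'X2']
```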
Item and Order with
Competitive Inhibition
Accurate item and order information can be
retained simultaneously by adding competition
and recurrent self-excitatory connections.
(Figure: F1 nodes X1, X2, X3, X4, …, Xn, each with a recurrent self-excitatory connection, all exciting a shared inhibitory node X that in turn inhibits every node in F1; input nodes I1, I2, I3, I4, …, In in F0)
Competitive Inhibition
Whenever any node in F1 becomes active:
X becomes active
X inhibits all nodes in F1
X1, …, Xn excite themselves to maintain activation
(Figure: the same F1/F0 network as on the previous slide)
Expanding Working Memory
This network is not robust in the face of varied input
timings
Another field (F2) can be introduced to provide stability
(Figure: a third field, F2, with nodes Y1, Y2, Y3, Y4, …, Yn, connected reciprocally with F1; F1 is gated by I and F2 by Ic)
Ic stands for "I complement": if I = 0, Ic = 1; if I = 1, Ic = 0.
The Presentation of an Input
When an input is presented:
Nodes in F0 and F1 become active, storing the input
F2 nodes are gated off so they do not change
Removal of Input
When no inputs are present:
Nodes in F0 are no longer active
Nodes in F1 remain active and do not change
F2 nodes become active and track F1 nodes
Adding a Second Input
When another input is presented:
F1 nodes start to represent both inputs, in order
F2 nodes help maintain that order
Removing The Second Input
When the second input is removed:
Nodes in F0 are no longer active
Nodes in F1 remain active and do not change
F2 nodes become active and track F1 nodes
One More Time….
When another input is presented:
F1 nodes start to represent all three inputs, in order
F2 nodes help maintain that order
One More Time….
When the third input is removed:
Nodes in F0 are no longer active
Nodes in F1 remain active and do not change
F2 nodes become active and track F1 nodes
The Math Behind the Madness
The rate of change of activation for each node
in F1 is described by:
dxi/dt = ( A Ii + xi − xi x ) I

(Figure: F1 nodes X1, X2, X3, …, Xi, …, Xn with the shared inhibitory node X; input nodes I1, I2, I3, …, Ii, …, In in F0)
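To make the notation concrete, here is an illustrative Euler update for this equation (a sketch, not the CELEST software). It assumes, as the figures suggest, that the shared node x carries the total F1 activity and that I simply switches the update on while any input is present; the value of A and the step size are made up.

```python
import numpy as np

A, dt = 0.25, 0.01     # made-up parameter and step size

def f1_step(x, Ivec):
    """One Euler step of dxi/dt = ( A Ii + xi - xi x ) I for the whole F1 field.

    x    : current F1 activations, one entry per node
    Ivec : current inputs Ii (1 for the item being presented, 0 elsewhere)
    """
    I = 1.0 if Ivec.any() else 0.0   # assumption: updates are gated by input presence
    xbar = x.sum()                   # assumption: the shared node x sums all F1 activity
    return x + dt * (A * Ivec + x - x * xbar) * I

x = np.zeros(4)
x = f1_step(x, np.array([1.0, 0.0, 0.0, 0.0]))   # present item 1 for one step
print(x)                                         # only node X1 has started to charge
```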
The Math Behind a New Field
When another field is added, two differential equations are
needed to describe how nodes in each of the fields
change:
dyi/dt = ( xi − yi ) Ic

dxi/dt = ( A Ii + yi − xi x ) I

(Figure: field F2 with nodes Y1, Y2, Y3, …, Yi, …, Yn above field F1 with nodes X1, X2, X3, …, Xi, …, Xn and the shared inhibitory node X; input nodes I1, I2, I3, I4, …, In in F0)
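Putting the two equations together, here is a minimal numerical sketch (again not the CELEST software) that presents three items in turn, integrating F1 while each input is on and letting F2 copy F1 during the pauses. The parameter A, the durations, and the step size are made-up values; with A < 1 the stored activations end up in a decreasing (primacy) gradient, matching the "higher activations = earlier items" picture.

```python
import numpy as np

A, dt, n = 0.25, 0.01, 5          # made-up values; A < 1 favours a primacy gradient
x = np.zeros(n)                   # F1 activations (working memory)
y = np.zeros(n)                   # F2 activations (the stabilising field)

for item in (0, 1, 2):            # present items 1, 2, 3 in order
    Ivec = np.eye(n)[item]
    # Input on (I = 1, Ic = 0): F1 integrates, F2 is gated off.
    for _ in range(int(5.0 / dt)):
        xbar = x.sum()                              # shared inhibition: total F1 activity
        x = x + dt * (A * Ivec + y - x * xbar)      # dxi/dt = ( A Ii + yi - xi x ) I
    # Input off (I = 0, Ic = 1): F1 is frozen, F2 copies F1.
    for _ in range(int(5.0 / dt)):
        y = y + dt * (x - y)                        # dyi/dt = ( xi - yi ) Ic

print(np.round(x, 3))   # first three nodes hold a decreasing (primacy) gradient
```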
Capabilities of the Model
Stores information about temporally ordered sequences
while exhibiting:
Primacy
Recency
Bowing
Temporal Chunking
A few shortcomings of this model:
Incapable of simulating other chunking phenomena
This model is for storage only; it does not deal with accessing working memory, or readout
LISTPARSE: An Advanced Model
LISTPARSE was developed to overcome shortcomings of the STORE model.
Chunking and other phenomena are captured by LISTPARSE.
The Sequence Learning application uses the working memory from this model.
(Figure: the LISTPARSE network architecture)
LISTPARSE Working Memory
LISTPARSE contains chunk cells that can store learned sequences.
This working memory is similar to the one described earlier.
(Figure: the LISTPARSE working memory circuit, showing list category chunks, list category learning, working memory storage, and volitional gain control across layers 2/3, 4, and 5/6)
Key Parameters of LISTPARSE
In the Model Layer, it is easy to alter the function of working
memory
This is done by adjusting parameters of LISTPARSE such as:
The strength and function of bottom-up connections
The strength and function of top-down connections
Volitional gain control (“attention” in the Model Layer)
Human memory can work differently in various states:
When we’re paying attention
When we’re tired, hungry, etc.
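Purely as an illustration of what such a parameter set might look like in code (the names and values below are hypothetical, not the actual Sequence Learning software settings):

```python
from dataclasses import dataclass

@dataclass
class WorkingMemoryParams:
    # Hypothetical knobs, named after the parameters listed above; values are made up.
    bottom_up_strength: float = 1.0   # input field -> working memory connections
    top_down_strength: float = 0.5    # learned chunks -> working memory feedback
    volitional_gain: float = 1.0      # "attention" in the Model Layer

attentive = WorkingMemoryParams()
tired = WorkingMemoryParams(volitional_gain=0.3)   # e.g. a low-attention state
print(attentive)
print(tired)
```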