
Supratim Ray
[email protected]
• Biophysics of Action Potentials
  • Passive Properties – neuron as an electrical circuit
  • Passive Signaling – cable theory
  • Active Properties – generation of the action potential
• Techniques
  • Random Variables and Poisson Distribution
  • Correlations – various techniques
  • Journal Session – Kohn and Smith, 2005, JNS
Suggested Reading
• Kandel, Schwartz and Jessell, Principles of Neural Science, Chapter 27
• Sheldon M. Ross, Stochastic Processes, Chapters 1-2, or
• Papoulis, Probability, Random Variables, and Stochastic Processes, Chapters 1-4
• Start reading the Journal Session paper; most of the techniques used in the paper will be covered here.
• Spike data and several Matlab codes will be emailed to you. Many of the codes relate to the topics covered here and are part of the homework assignment.
Data Acquisition and Questions
• 10x10 grid of electrodes
• 400 micron tip-to-tip spacing
• Primary visual cortex (V1)
[Figure: attended side and receptive field location]
Figure 27-11: Kandel, Schwartz and Jessell
http://www.youtube.com/watch?v=8VdFf3egwfg
Figure 27-14: Kandel, Schwartz and Jessell
>> load SpikeData_022309SRC_001_elecs49_51.mat
>> rasterplot(SpikeDataList{1}); xlim([-0.4 0.8]);
>> load SpikeData_022309SRC_001_elecs49_51.mat
>> [H,timeVals]=psthplot(SpikeDataList{1},1,[-0.4 0.8],4);
>> plot(timeVals,H); xlim([-0.4 0.8]);
• In a rate code, the precise timing of spikes does not matter; the only metric of interest is the total number of spikes in a given time interval.
• In a temporal code, the precise timing of spikes is also important. In particular, if a sufficient number of neurons fire synchronously, they have a larger impact on the downstream neuron. A sketch contrasting the two codes follows.
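To make the distinction concrete, here is a minimal sketch (hypothetical spike times, plain Matlab) of two spike trains that look identical to a rate code but very different to a temporal code:

% Two hypothetical spike trains (spike times in seconds) with the same
% spike count in [0, 1] s but very different temporal structure.
train1 = [0.05 0.21 0.38 0.55 0.72 0.90];   % spread-out spikes
train2 = [0.48 0.49 0.50 0.51 0.52 0.53];   % a near-synchronous burst

% Rate code: only the counts matter, and they are equal.
count1 = sum(train1 >= 0 & train1 < 1);     % = 6
count2 = sum(train2 >= 0 & train2 < 1);     % = 6

% Temporal code: the inter-spike intervals reveal the difference.
disp([count1 count2]);   % same rate
disp(diff(train1));      % long, roughly regular intervals
disp(diff(train2));      % short (10 ms) intervals: near-synchronous firing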
Introduction, Basic Concepts
• A random experiment is characterized by the property that its observation under a given set of circumstances does not always lead to the same observed outcome, but rather to different outcomes in such a way that there is statistical regularity.
• Sample Space (S): the set of all possible outcomes of an experiment.
• Events: an event is a subset of the sample space, and is said to occur if the outcome of the experiment is an element of that subset. Individual members of the sample space are called sample points or elementary events.
• Due to certain consistency problems, it is sometimes not possible to assign probabilities to all possible subsets of the sample space. We therefore consider a certain class F of subsets of S satisfying certain axioms, declare the members of F to be events, and then define the probability of an event.
• Let F be a class of subsets of a sample space S. F is said to be a sigma field of events if
  • S is in F.
  • If A is in F, then so is the complement of A.
  • If {An} is a sequence of elements of F, then the union of the An is in F.
• Examples
  • F = {Φ, S} is the trivial sigma field.
  • F = {Φ, A, Ac, S} is a sigma field, where A is any subset of S and Ac denotes the complement of A.
• A set function P defined on a sigma field F of subsets of the sample space S of a random experiment is called a probability measure if it satisfies the following axioms:
  • 0 ≤ P(A) ≤ 1 for all A in F.
  • P(S) = 1.
  • If A1, A2, … are mutually exclusive (disjoint) events in F, i.e., Ai ∩ Aj = Φ for i ≠ j, then
      P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + …
• Experiment: a coin is tossed once and we report the side of the coin.
• Sample Space: S = {H, T}
• Sigma Field 1: {Φ, S}
• Sigma Field 2: {Φ, {H}, {T}, S}
• Probability measure on Sigma Field 2:
  • P({H}) = p
  • P({T}) = q = 1 - p
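As a quick numerical illustration of the statistical regularity mentioned earlier, simulating this experiment many times shows the relative frequency of heads converging to P({H}) = p (a sketch, with p = 0.3 assumed):

% Simulate N independent tosses of a coin with P(heads) = p.
p = 0.3; N = 100000;
tosses = rand(1, N) < p;   % 1 = heads, 0 = tails

% The relative frequency of heads approaches p as N grows.
disp(mean(tosses));        % close to 0.3 for large N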
• Let A and B be two events on a sample space S, with a probability P defined on the sigma field F of subsets of S. The conditional probability of the event B given the event A, denoted P(B|A), is defined by
    P(B|A) = P(A ∩ B) / P(A)
  if P(A) > 0, and is undefined if P(A) = 0.
• Independence of events: two events A and B are said to be independent if
    P(A ∩ B) = P(A) P(B)
• A random variable is a function that maps each element of the sample space S to a number on the real line.
• We can ask questions of the following type: what is the probability that a random variable X is less than a number x? Because probabilities are assigned to events, we need to reformulate the question in terms of the events in the sigma field F.
• Formal definition: let S be the sample space of a random experiment and F the sigma field of events. A finite, single-valued function X which maps S into R is called a random variable if the inverse images under X of all intervals (-∞, x] are events, i.e., if
    X⁻¹((-∞, x]) = [X ≤ x] = {ω : X(ω) ≤ x} ∈ F
• Coin tossing experiment: S = {T, H}, F = {Φ, {H}, {T}, S}.
• Define a function X on S as follows: X(T) = 0, X(H) = 1. Then
    X⁻¹((-∞, x]) = Φ     if x < 0
                 = {T}   if 0 ≤ x < 1
                 = S     if x ≥ 1
• Each inverse image is an event in F, so X is a random variable.
• Let X be a random variable. Define a function F on R by
    F(x) = P([X ≤ x]) = P({ω : X(ω) ≤ x})
  This function is called the distribution function (DF) or the cumulative distribution function (CDF) of the random variable.
• The probability density function (pdf) is the derivative of F with respect to x. If the random variable is discrete, the density is concentrated at certain points; it is then called the probability mass function (pmf).
• Exercise: find the cdf and pmf of the single coin toss described before.
• Experiment: suppose you take a coin and flip it N times, and suppose the probability of getting heads on each flip is p.
• The sample space has 2^N possible sequences.
• Define a random variable X as the number of times heads appears in a sequence.
• X takes the value k with probability
    P(X = k) = C(N, k) p^k (1 - p)^(N-k),   k = 0, 1, …, N
  where C(N, k) = N! / (k! (N-k)!) is the binomial coefficient.
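A short numerical check (a sketch, not from the emailed codes): Matlab's nchoosek gives the binomial coefficient, so the pmf can be computed and plotted directly.

% Binomial pmf for N coin flips with P(heads) = p (N = 10, p = 0.3 assumed).
N = 10; p = 0.3;
k = 0:N;
pmf = arrayfun(@(kk) nchoosek(N, kk), k) .* p.^k .* (1-p).^(N-k);

% The probabilities sum to 1, and the pmf can be plotted against k.
disp(sum(pmf));                              % = 1 (up to floating point)
stem(k, pmf); xlabel('k'); ylabel('P(X = k)');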
[Figure: binomial distribution pmf; taken from http://en.wikipedia.org/wiki/Binomial_distribution]

[Figure: normal distribution pdf; taken from http://en.wikipedia.org/wiki/Normal_distribution]
• Expected Value (Mean): E(X) = Σx x P(X = x) for a discrete random variable
  (http://en.wikipedia.org/wiki/Expected_value)
• Variance: Var(X) = E[(X - E(X))²]
  (http://en.wikipedia.org/wiki/Variance)
• Binomial Distribution: Mean = Np, Var = Np(1-p)
• Normal Distribution: Mean = μ, Var = σ²
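As a sanity check (a sketch with assumed parameters), simulated binomial draws should have sample mean close to Np and sample variance close to Np(1-p):

% Draw many binomial samples by summing Bernoulli trials (N = 10, p = 0.3 assumed).
N = 10; p = 0.3; nTrials = 100000;
X = sum(rand(N, nTrials) < p, 1);   % each column is one binomial sample

disp(mean(X));   % close to N*p = 3
disp(var(X));    % close to N*p*(1-p) = 2.1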
• Step 1: find time intervals in which the firing rate is approximately constant.
  • Baseline Period: [-300 -44] ms (256 ms duration)
  • Stimulus Period: [150 406] ms (256 ms duration)
• Step 2: define random variables XBL and XST, which give the total number of spikes in the baseline and stimulus intervals, respectively.
• Step 2 (continued):
  >> XBL = getSpikeCounts(SpikeDataList{1},[-0.3 -0.044]);
  >> XST = getSpikeCounts(SpikeDataList{1},[0.15 0.406]);
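getSpikeCounts is one of the emailed helper functions, so its actual implementation is not shown here; the following is a minimal sketch of what such a function might look like, assuming spikeData is a cell array with one vector of spike times (in seconds) per trial:

function counts = getSpikeCounts(spikeData, timeRange)
% Hypothetical re-implementation, for illustration only.
% spikeData : cell array, spikeData{i} = spike times (s) for trial i
% timeRange : [tMin tMax], counting window in seconds
% counts    : 1 x nTrials vector of spike counts per trial
nTrials = length(spikeData);
counts = zeros(1, nTrials);
for i = 1:nTrials
    t = spikeData{i};
    counts(i) = sum(t >= timeRange(1) & t < timeRange(2));
end
end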
• Step 2 (continued):
  >> plot(XBL,'g'); hold on; plot(XST,'r'); plot(XBL,'go'); plot(XST,'ro');
• Step 3: compute the probability mass function (pmf).
  >> [pBL,cBL] = getPMF(XBL); plot(cBL,pBL,'g')
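getPMF is also among the emailed codes; a plausible sketch of what it computes (an empirical pmf: the fraction of trials showing each spike count), with the output convention assumed from the call above:

function [p, c] = getPMF(counts)
% Hypothetical re-implementation, for illustration only.
% counts : vector of spike counts (one entry per trial)
% c      : count values spanning the observed range
% p      : fraction of trials with each count value (sums to 1)
c = min(counts):max(counts);
p = zeros(size(c));
for i = 1:length(c)
    p(i) = mean(counts == c(i));
end
end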
Suppose events (spikes) occur at a constant rate λ, such that:
• Independent increments: if we count the number of spikes in two non-overlapping intervals, the counts should be independent; knowledge of the number of spikes in one interval should not reveal any information about the number of spikes in another interval that does not overlap with the first.
• Stationary increments: the number of spikes in a given interval should depend only on the length of the interval, not on its position.
• The probability of a single spike in a small interval h is approximately λh.
• The probability of more than one spike in a small interval h is negligible.
It can be shown that in this case, the number of spikes in an interval of length t follows a Poisson distribution with parameter λt.
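These assumptions translate directly into a simulation (a sketch; the rate and interval length are assumed): place spikes independently in many tiny bins, each with probability λh, and check that the resulting counts have mean and variance close to λt.

% Simulate a homogeneous Poisson spike train by discretizing time into
% small bins of width h, each containing a spike with probability lambda*h.
% (lambda = 50 spikes/s and t = 0.256 s are assumed for illustration.)
lambda = 50; t = 0.256; h = 1e-4;
nBins = round(t / h); nTrials = 2000;

spikes = rand(nBins, nTrials) < lambda * h;   % one column per trial
counts = sum(spikes, 1);                      % spike count per trial

% For a Poisson distribution, mean = variance = lambda*t (here 12.8),
% so the Fano factor (variance/mean) should be close to 1.
disp([mean(counts) var(counts) var(counts)/mean(counts)]);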
• A random variable X is said to have a Poisson distribution with parameter μ if the pmf of X is given by
    P(X = k) = e^(-μ) μ^k / k!,   k = 0, 1, 2, 3, …
• E(X) = μ
• Var(X) = μ
• Fano Factor = Var/Mean = 1
• A Poisson distribution is applicable in many situations in which some kind of "event", "change of state", "flaw", or "failure" occurs in a manner thought of intuitively as "at random."
• The law of rare events (or Poisson limit theorem) gives a Poisson approximation to a binomial distribution under certain circumstances: if N is large and p is small, with Np = μ held fixed, the Binomial(N, p) pmf approaches the Poisson(μ) pmf.
  Proof: see http://en.wikipedia.org/wiki/Poisson_limit_theorem
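A quick numerical illustration of the limit theorem (N = 1000 and p = 0.005 are assumed, so μ = Np = 5):

% Compare Binomial(N, p) with Poisson(mu = N*p) for large N and small p.
N = 1000; p = 0.005; mu = N * p;
k = 0:15;

% Binomial coefficients computed via gammaln to avoid overflow warnings.
logC = gammaln(N+1) - gammaln(k+1) - gammaln(N-k+1);
binomPmf = exp(logC + k.*log(p) + (N-k).*log(1-p));
poissPmf = exp(-mu) * mu.^k ./ factorial(k);

% The two pmfs agree closely.
disp(max(abs(binomPmf - poissPmf)));   % small (on the order of 1e-4)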
• >> mBL = mean(XBL); vBL = var(XBL);
• >> mST = mean(XST); vST = var(XST);
  mBL = 2.0, vBL = 1.95, Fano Factor = 0.97
  mST = 6.05, vST = 8.56, Fano Factor = 1.41
• If XBL were indeed Poisson distributed with mean mBL, the theoretical pmf would be:
  >> gBL = exp(-mBL) * ((mBL).^cBL) ./ factorial(cBL)
• Similarly, after computing [pST,cST] = getPMF(XST):
  >> gST = exp(-mST) * ((mST).^cST) ./ factorial(cST)
• Now plot gBL and gST along with pBL and pST:
  >> plot(cBL,pBL,'go'); hold on; plot(cBL,pBL,'g');
  >> plot(cBL,gBL,'g*'); hold on; plot(cBL,gBL,'g--');
• A continuous random variable X is said to have an exponential distribution with parameter λ > 0 if its pdf is given by
    f(x) = λ e^(-λx)   for x ≥ 0
         = 0           for x < 0
• Equivalently, its cdf is given by
    F(x) = 1 - e^(-λx)   for x ≥ 0
         = 0             for x < 0
• E(X) = 1/λ, Var(X) = 1/λ², so CV = std/mean = 1.
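A sketch (λ assumed) of sampling from this distribution by inverting the cdf: if U is Uniform(0,1), then X = -log(U)/λ is Exponential(λ). The moments can then be checked numerically.

% Sample exponential variates via the inverse cdf: if U ~ Uniform(0,1),
% then X = -log(U)/lambda is Exponential(lambda). (lambda = 20 is assumed,
% roughly the ISI distribution of a 20 spikes/s Poisson neuron.)
lambda = 20; n = 100000;
X = -log(rand(1, n)) / lambda;

% Mean and std should both be near 1/lambda = 0.05, so CV is near 1.
disp([mean(X) std(X) std(X)/mean(X)]);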
• The exponential distribution is memoryless, where the memoryless property is defined as
    P{X > s+t | X > t} = P{X > s}   for s, t ≥ 0
• The exponential distribution is the unique continuous distribution possessing this property.
• The hazard or failure rate function λ(t) is defined as
    λ(t) = f(t) / (1 - F(t))
• For an exponential distribution, the hazard function is constant: λ(t) = λe^(-λt) / e^(-λt) = λ.
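The memoryless property can be checked empirically on simulated exponential samples (same assumed λ as above):

% Empirical check of P{X > s+t | X > t} = P{X > s} for exponential samples.
lambda = 20; n = 1000000;
X = -log(rand(1, n)) / lambda;
s = 0.03; t = 0.05;

condProb   = sum(X > s + t) / sum(X > t);   % P{X > s+t | X > t}
uncondProb = mean(X > s);                   % P{X > s}
disp([condProb uncondProb]);                % nearly equal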
• >> isiBL = getISIs(SpikeDataList{1},[-0.3 -0.044]);
• Homework: use the hist command to plot the ISI pdf, and also plot the theoretical exponential distribution with the same mean as isiBL.
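getISIs is another of the emailed helpers; a minimal sketch of what it plausibly returns (inter-spike intervals pooled across trials, computed from spikes falling inside the requested window):

function isis = getISIs(spikeData, timeRange)
% Hypothetical re-implementation, for illustration only.
% Returns inter-spike intervals (s), pooled across trials, for spikes
% falling within timeRange = [tMin tMax].
isis = [];
for i = 1:length(spikeData)
    t = spikeData{i};
    t = t(t >= timeRange(1) & t < timeRange(2));
    isis = [isis, diff(t(:)')];   % intervals within this trial
end
end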