Random Data
Workshops 1 & 2
L. L. Koss
Random Data Analysis
Random Data--- L.L. Koss
Methods of characterizing random data:
1. Probability
2. Correlation
3. Frequency spectra
System Modeling
1. Frequency response functions
2. Auto-regressive models
3. Impulse response functions
Random Data--- L.L. Koss
1. Data that can be represented by a mathematical function is called “Deterministic”, e.g. the vibration of a spring-mass system released from an initial displacement (see the expression after this list).
2. Data that cannot be represented by an explicit mathematical function is called “Random”, e.g. the height of an ocean wave.
3. A time series is a random function, e.g. X(t), of an independent variable t, where t is time.
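For illustration, the spring-mass example in item 1 has the explicit form x(t) = X0·cos(ωn·t), where X0 is the initial displacement and ωn = √(k/m) is the natural frequency (symbols introduced here for the illustration); every future value of x(t) is known exactly, which is what “deterministic” means.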
Random Data--- L.L. Koss
• Stochastic Process
1. If we examine random data, h(t), over
different periods of time there may not be
any visual similarities over different time
periods of observation, T1 and T2.
2. Many transducers are placed in the field to
observe the random process h1(t), h2(t),
… hn(t). Let n approach infinity.
Random Data--- L.L. Koss
3. Each of the hi(t) is called a sample time
function. The collection of time functions is
called an ensemble.
4. The collection of all possible time functions
that the random process may have produced
is called a “stochastic process”.
5. Usually, only a small number of time
records are available to describe the process
and they last only for a finite time.
6. When can sample time records be used to
describe a process?
Random Data--- L.L. Koss
Classification of random processes:
Random Process
• Stationary
    • Ergodic
    • Non-ergodic
• Non-stationary
• Special classifications
Random Data--- L.L. Koss
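For the discussion below, the ensemble mean m(t1) and the ensemble autocorrelation R(t1, t1+τ) are assumed to have their usual definitions, averaged across the sample functions hi(t) at fixed times:
m(t1) = E[h(t1)] ≈ (1/n) Σ hi(t1)
R(t1, t1+τ) = E[h(t1)·h(t1+τ)] ≈ (1/n) Σ hi(t1)·hi(t1+τ)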
Random Data--- L.L. Koss
• If m(t1) and R(t1, t1+τ) vary in amplitude as t1 is changed, the random process is said to be “non-stationary”.
• If m(t1) and R(t1, t1+τ) do not vary in amplitude as t1 is changed, the random process is said to be weakly “stationary”, or stationary in the wide sense. Many processes fit this description. For stationary data, m and R(τ) are independent of absolute time (a numerical sketch follows).
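A minimal numerical sketch of these ensemble averages, using white Gaussian noise as a stand-in stationary process (the record counts, lengths and times are illustrative, not from the slides):

    import numpy as np

    # Sketch: ensemble-averaged mean and autocorrelation of a stationary process,
    # estimated from n simulated sample records h_i(t) (white Gaussian noise stand-in).
    rng = np.random.default_rng(0)
    n, N = 500, 1024                          # number of records, samples per record
    h = rng.normal(0.0, 1.0, size=(n, N))     # ensemble: row i is the record h_i(t)

    tau = 5                                   # lag, in samples
    for t1 in (100, 600):                     # two different absolute times
        m_t1 = h[:, t1].mean()                       # ensemble mean m(t1)
        R_t1 = (h[:, t1] * h[:, t1 + tau]).mean()    # ensemble autocorrelation R(t1, t1+tau)
        print(t1, m_t1, R_t1)                 # for stationary data, neither depends on t1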
Random Data--- L.L. Koss
Ergodic Random Process
• Let us examine the “kth” time record and compute m and R(τ) over time “t” rather than over an ensemble.
• m(k) = (1/T) ∫ hk(t) dt, integrated from 0 to T, as T becomes large.
• R(k, τ) = (1/T) ∫ hk(t)·hk(t+τ) dt, integrated from 0 to T, as T becomes large.
• If the random process is stationary, and m(k) and R(k, τ) are independent of “k” (do not differ) and are equal to the ensemble averaged values, the random process is said to be “ergodic” (a numerical sketch follows).
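A minimal sketch of the corresponding time averages computed from a single sample record (white Gaussian noise again used as a stand-in; names are illustrative):

    import numpy as np

    # Sketch: time-averaged mean and autocorrelation from one long sample
    # time function h_k(t), as used for an ergodic process.
    rng = np.random.default_rng(1)
    T = 200_000
    h_k = rng.normal(0.0, 1.0, size=T)        # a single sample time function

    m_k = h_k.mean()                          # m(k): time-averaged mean

    def time_autocorr(x, lag):
        # R(k, tau): time average of the record times itself shifted by tau samples
        return np.mean(x[: len(x) - lag] * x[lag:])

    print(m_k)                    # ~0, matching the ensemble mean of this process
    print(time_autocorr(h_k, 0))  # ~1, the mean square value of the record
    print(time_autocorr(h_k, 5))  # ~0 at a nonzero lag for white noise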
Random Data--- L.L. Koss
• For ergodic random processes the time
averaged mean value and autocorrelation
function are equal to the corresponding
ensemble averaged values.
• Thus, analysis of a single sample time
function gives results that describe the
random process!!
Random Data--- L.L. Koss
• Example of a nearly rectangular/flat distribution
Random Data--- L.L. Koss
• Gaussian or Normal Distribution
• p(x) = 1/(σ·√(2π)) · exp(-(x-m)²/(2σ²))
• where m is the mean value and σ is the standard deviation.
• For all probability density functions:
• ∫ p(x) dx = 1, integrating from -∞ to +∞.
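A minimal numerical check of this density and its normalisation (the mean and standard deviation values are illustrative):

    import numpy as np

    # Sketch: evaluate the Gaussian density and check it integrates to one.
    m, sigma = 2.0, 0.5                       # illustrative mean and standard deviation

    def p(x):
        # Gaussian probability density with mean m and standard deviation sigma
        return np.exp(-(x - m) ** 2 / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

    x = np.linspace(m - 8 * sigma, m + 8 * sigma, 20001)
    dx = x[1] - x[0]
    print((p(x) * dx).sum())                  # ~1.0: integral of p(x) dx
    print((x * p(x) * dx).sum())              # ~m: the mean value
    print(((x - m) ** 2 * p(x) * dx).sum())   # ~sigma**2: the variance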
Random Data--- L.L. Koss
E, the expectation operator: the expected or mean value of a random variable. The expected value of wave height is given by E[h(t)] and is an average over an ensemble of time functions. For a stationary process
E[h(t)] = ∫ h·p(h) dh, integrating from -∞ to +∞.
Mean square value: E[h(t)²]
E[h(t)²] = ∫ h²·p(h) dh, integrating from -∞ to +∞.
If the process is ergodic, then the mean and mean square value can also be calculated along a sample time function.
Random Data--- L.L. Koss
• The variance of the process, σ², or standard deviation, σ, is given by
• σ² = E[(h - E[h])²]
• σ² = E[h²] - (E[h])², or
• Variance = mean square value - mean squared (checked numerically below).
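A minimal numerical check of this relation on a simulated record (the distribution used is an illustrative assumption):

    import numpy as np

    # Sketch: verify sigma**2 = E[h**2] - (E[h])**2 on a simulated record.
    rng = np.random.default_rng(2)
    h = rng.normal(3.0, 2.0, size=100_000)    # illustrative mean 3, standard deviation 2

    mean = h.mean()                           # E[h]
    mean_square = np.mean(h ** 2)             # E[h**2]

    print(mean_square - mean ** 2)            # ~4.0
    print(h.var())                            # ~4.0: the variance, sigma**2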
Random Data--- L.L. Koss
• Joint Probability Distributions
• SISO- Single Input Single Output System
Random Data--- L.L. Koss
• The first order probability density function can be obtained from the second order probability density function by integrating out the variable not required, e.g.
• p(h) = ∫ p(h, y) dy, integrating from -∞ to +∞.
• Conditional probability density function:
• Given a y value, what is the probability density of h?
• p(h|y) = p(h, y)/p(y) (see the sketch after this list).
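A minimal sketch of marginalising and conditioning a discretised joint density p(h, y) on a grid (a correlated bivariate Gaussian is used purely as an illustration; all names and values are assumptions, not from the slides):

    import numpy as np

    # Sketch: marginal p(h) and conditional p(h|y) from a tabulated joint density.
    rho = 0.6                                  # illustrative correlation coefficient
    h = np.linspace(-5, 5, 401)
    y = np.linspace(-5, 5, 401)
    H, Y = np.meshgrid(h, y, indexing="ij")
    p_hy = np.exp(-(H**2 - 2*rho*H*Y + Y**2) / (2*(1 - rho**2))) / (2*np.pi*np.sqrt(1 - rho**2))

    dh = h[1] - h[0]
    dy = y[1] - y[0]
    p_h = p_hy.sum(axis=1) * dy                # marginal: p(h) = integral of p(h, y) dy
    p_y = p_hy.sum(axis=0) * dh                # marginal p(y)

    j = 240                                    # grid index of a chosen y value
    p_h_given_y = p_hy[:, j] / p_y[j]          # conditional: p(h|y) = p(h, y)/p(y)

    print(p_h.sum() * dh)                      # ~1.0: the marginal integrates to one
    print(p_h_given_y.sum() * dh)              # ~1.0: so does the conditional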
Random Data--- L.L. Koss
• Second order Gaussian distribution
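Assuming the usual notation (means m_h and m_y, standard deviations σ_h and σ_y, correlation coefficient ρ; these symbols are introduced here, not taken from the slide), the second order Gaussian density can be written
p(h, y) = 1/(2π·σ_h·σ_y·√(1-ρ²)) · exp{ -1/(2(1-ρ²)) · [ (h-m_h)²/σ_h² - 2ρ(h-m_h)(y-m_y)/(σ_h·σ_y) + (y-m_y)²/σ_y² ] }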
Random Data--- L.L. Koss
• MISO – Multi-Input Single-Output
• MIMO – Multi-Input Multi-Output
Random Data--- L.L. Koss
• Higher order probability density functions
• P(x1,u,h,v) – 5 Dimensions
• Do relationships exist between these variables? Are they linear? What frequencies exist in the time data?
• Use correlation to assist in answering the above questions (a sketch follows):
• Ordinary correlation between two variables
• Partial correlation between inputs, and between inputs and outputs
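A minimal sketch of ordinary correlation between two recorded variables (the linear relationship and noise level are illustrative assumptions):

    import numpy as np

    # Sketch: ordinary (product-moment) correlation between an input and an output record.
    rng = np.random.default_rng(3)
    x = rng.normal(size=10_000)                          # input record
    y = 2.0 * x + rng.normal(scale=0.5, size=x.size)     # output = linear map of x plus noise

    r = np.corrcoef(x, y)[0, 1]                          # ordinary correlation coefficient
    print(r)                                             # close to 1: a strong linear relationship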
Newland—Chap 3.
Newland, D. E. (1993) “An
Introduction to Random Vibrations,
Spectral and Wavelet Analysis”.
Chapter 3, pp. 21-23.
Newland—Chap 3.
• Random Structure under load
Newland —Chap 3.
Newland, D. E. (1993) “An Introduction to
Random Vibrations, Spectral and Wavelet
Analysis”.
Chapter 3, pp. 24-32.
Circular Correlation Function
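A minimal sketch of a circular (cyclic) cross-correlation computed with the FFT, assuming the usual wrap-around definition (the signals, shift and names are illustrative):

    import numpy as np

    # Sketch: circular cross-correlation of two equal-length records via the FFT.
    def circular_correlation(x, y):
        # R_xy[k] = (1/N) * sum_n x[n] * y[(n + k) mod N]
        N = len(x)
        X = np.fft.fft(x)
        Y = np.fft.fft(y)
        return np.real(np.fft.ifft(np.conj(X) * Y)) / N

    # Illustrative use: y is x circularly delayed by 25 samples plus a little noise,
    # so the circular correlation peaks at lag k = 25.
    rng = np.random.default_rng(4)
    x = rng.normal(size=256)
    y = np.roll(x, 25) + 0.1 * rng.normal(size=256)
    R = circular_correlation(x, y)
    print(int(np.argmax(R)))                   # 25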
Recommended References
Bendat, J. S. and Piersol, A. G. (1971) “Random Data: Analysis and Measurement Procedures”.
Bendat, J. S. and Piersol, A. G. (1980) “Engineering Applications of Correlation and Spectral Analysis”.