Low-Dimensional Chaotic
Signal Characterization
Using Approximate Entropy
Soundararajan Ezekiel
Matthew Lang
Computer Science Department
Indiana University of Pennsylvania
Roadmap
Overview
Introduction
Basics and Background
Methodology
Experimental Results
Conclusion
Overview
Many signals appear to be random
May be chaotic or fractal in nature
Must be wary of noisy systems
Analysis of chaotic properties is in order
Our method - approximate entropy
Introduction
Chaotic behavior is a lack of periodicity
Historically, non-periodicity implied
randomness
Today, we know this behavior may be
chaotic or fractal in nature
Power of fractal and chaos analysis
Introduction
Chaotic systems have four essential
characteristics:
deterministic system
sensitive to initial conditions
unpredictable behavior
values depend on attractors
Introduction
The attractor's dimension is a useful
starting point
Even an incomplete description is useful
Basics and Background
Fractal analysis
Fractal dimension is defined for a set whose
Hausdorff-Besicovitch dimension exceeds
its topological dimension.
Also can be described by self-similarity
property
Goal: find self-similar features and
characterize data set
Basics and Background
Chaotic analysis
Output of system mimics random behavior
Goal: determine mathematical form of
process
Performed by transforming data to a
phase space
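The transformation to phase space described above is commonly done with delay-coordinate embedding, which the algorithm later in these slides also relies on. A minimal sketch in plain Python (the function name and the example lag are illustrative, not from the slides):

```python
def embed(signal, dim, tau):
    """Delay-coordinate embedding: map a scalar time series into a
    dim-dimensional phase space using time lag tau.  Each phase-space
    point is (s[i], s[i+tau], ..., s[i+(dim-1)*tau])."""
    n = len(signal) - (dim - 1) * tau
    return [tuple(signal[i + k * tau] for k in range(dim)) for i in range(n)]

# Example: embed a short series in a 2-D phase space with lag 1.
points = embed([1.0, 2.0, 3.0, 4.0, 5.0], dim=2, tau=1)
# points = [(1.0, 2.0), (2.0, 3.0), (3.0, 4.0), (4.0, 5.0)]
```

Plotting these points against each other reconstructs the attractor whose topological properties the following slides analyze.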
Basics and Background
Definitions
Phase Space: an n-dimensional space, where
n is the number of dynamical variables
Attractor: finite set formed by values of
variables
Strange Attractors: an attractor that is
fractal in nature
Basics and Background
Analysis of phase space
Determine topological properties
visual analysis
capacity, correlation, information dimension
approximate entropy
Lyapunov exponents
Basics and Background
Fractal dimension of the attractor
Related to number of independent
variables needed to generate time series
number of independent variables is
smallest integer greater than fractal
dimension of attractor
Basics and Background
Box Dimension
Estimator for fractal dimension
Measure of the geometric aspect of the
signal on the attractor
Count of boxes covering attractor
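The box-counting idea above can be sketched directly: count occupied boxes at several scales and fit the slope of log N(eps) against log(1/eps). This is a generic illustration of the estimator, not the authors' implementation; the helper names and scale choices are hypothetical.

```python
import math

def box_count(points, eps):
    """Number of eps-sized boxes needed to cover a set of 2-D points."""
    return len({(math.floor(x / eps), math.floor(y / eps)) for x, y in points})

def box_dimension(points, eps_values):
    """Least-squares slope of log N(eps) vs. log(1/eps) -- a crude
    estimator of the box (fractal) dimension."""
    xs = [math.log(1.0 / e) for e in eps_values]
    ys = [math.log(box_count(points, e)) for e in eps_values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check: a densely sampled line segment has dimension close to 1.
pts = [(i / 1000.0, i / 1000.0) for i in range(1000)]
print(round(box_dimension(pts, [0.1, 0.05, 0.02, 0.01]), 2))  # → 1.0
```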
Basics and Background
Information dimension
Similar to box dimension
Accounts for frequency of visitation
Based on point weighting - measures rate
of change of information content
Methodology
Approximate Entropy is based on
information dimension
Embedded in lower dimensions
Computation is similar to that of correlation
dimension
Algorithm
Given a signal {Si}, calculate the
approximate entropy for {Si} by the
following steps. Note that the approximate
entropy may be calculated for the entire
signal, or the entropy spectrum may be
calculated for windows {Wi} on {Si}. If the
entropy of the entire signal is being
calculated consider {Wi} = {Si}.
Algorithm
Step 1: Truncate the peaks of {Wi}. During
the digitization of analog signals, some
unnecessary values may be generated by
the monitoring equipment.
Step 2: Calculate the mean and standard
deviation (Sd) for {Wi} and compute the
tolerance limit R equal to 0.3 * Sd to
reduce the noise effect.
Algorithm
Step 3: Construct the phase space by
plotting {Wi} vs. {Wi+τ}, where τ is the time
lag, in an E = 2 space.
Step 4: Calculate the Euclidean distance
Di between each pair of points in the
phase space. Count Ci(R) the number of
pairs in which Di < R, for each i.
Algorithm
Step 5: Calculate the mean of Ci(R); the
log of this mean is the approximate entropy
Apn(E) for Euclidean dimension E = 2.
Step 6: Repeat Steps 2-5 for E = 3.
Step 7: The approximate entropy for {Wi}
is calculated as Apn(2) - Apn(3).
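The steps above can be sketched in Python. This is one plausible reading of the slides, not the authors' implementation: Step 1 (peak truncation) is assumed already done, the lag tau = 1 is an illustrative default, and self-matches are included in the pair counts.

```python
import math

def approx_entropy(w, tau=1, r_factor=0.3):
    """Approximate entropy of window w per Steps 2-7:
    tolerance R = r_factor * Sd, pair counting with Euclidean
    distance in E = 2 and E = 3, result is Apn(2) - Apn(3)."""
    mean = sum(w) / len(w)
    sd = math.sqrt(sum((x - mean) ** 2 for x in w) / len(w))
    r = r_factor * sd  # Step 2: tolerance limit R = 0.3 * Sd

    def apn(dim):
        # Step 3: phase space via delay embedding, {Wi} vs. {Wi+tau}.
        n = len(w) - (dim - 1) * tau
        pts = [[w[i + k * tau] for k in range(dim)] for i in range(n)]
        # Step 4: for each point i, count pairs with distance Di < R.
        counts = [sum(1 for j in range(n) if math.dist(pts[i], pts[j]) < r)
                  for i in range(n)]
        # Step 5: log of the mean of Ci(R) is Apn(E).
        return math.log(sum(counts) / len(counts))

    # Steps 6-7: Apn(2) - Apn(3).
    return apn(2) - apn(3)

# A strongly periodic signal should give a relatively low value.
periodic = [math.sin(0.5 * i) for i in range(200)]
print(approx_entropy(periodic))
```

The brute-force pair counting is O(n²), which is fine for the window sizes implied by a spectrum over {Wi} but would need spatial indexing for long signals.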
Experimental Results
[Figure slides: approximate entropy results for noise, HRV (young subject), HRV (older subject), a stock signal, and two seismic signals]
Conclusion
High approximate entropy - randomness
Low approximate entropy - periodic
Approximate entropy can be used to
evaluate the predictability of a signal
Low predictability - random