Curve Fitting and Regression
EEE 244
Descriptive Statistics in MATLAB
• MATLAB has several built-in commands to compute
and display descriptive statistics. Assuming some
column vector s:
– mean(s), median(s), mode(s)
• Calculate the mean, median, and mode of s. mode was
part of the Statistics Toolbox in older releases; recent
versions include it in base MATLAB.
– min(s), max(s)
• Calculate the minimum and maximum values in s.
– var(s), std(s)
• Calculate the variance (the square of the standard
deviation) and the standard deviation of s.
• Note: if a matrix is given, the statistics are
returned for each column.
Example: Drain Current Measurements
• Drain current in mA, measured at intervals from t = 10am to 10pm:
6.5, 6.3, 6.2, 6.5, 6.2, 6.7, 6.4, 6.4, 6.8
• Find the mean, median, mode, min, max, variance, and
standard deviation using the appropriate MATLAB
functions, as sketched below.
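A minimal MATLAB sketch of this exercise (the variable name s is our choice):

    s = [6.5; 6.3; 6.2; 6.5; 6.2; 6.7; 6.4; 6.4; 6.8];   % drain current, mA
    mean(s)             % arithmetic mean
    median(s)           % middle value of the sorted data
    mode(s)             % most frequently occurring value
    min(s), max(s)      % smallest and largest measurements
    var(s)              % variance
    std(s)              % standard deviation, the square root of the variance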
Histograms in MATLAB
• [n, x] = hist(s, x)
– Determine the number of elements in each bin of data in s. x is
a vector containing the center values of the bins.
– Open the MATLAB help and run the example with 1000 randomly
generated values for s.
• [n, x] = hist(s, m)
– Determine the number of elements in each bin of data in s using
m bins. x will contain the centers of the bins. The default case
is m=10
– Repeat the previous example, setting m = 5.
• hist(s) by itself draws the histogram plot (see the sketch below).
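A short sketch of the calling patterns above, using randomly generated data as in the help example:

    s = randn(1000, 1);       % 1000 normally distributed random values
    [n, x] = hist(s);         % default m = 10 bins: n = counts, x = bin centers
    [n5, x5] = hist(s, 5);    % the same data with m = 5 bins
    hist(s)                   % draw the histogram plot directly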
REGRESSION
Linear Regression
• Fitting a straight line to a set of paired observations
(x1, y1), (x2, y2), …, (xn, yn):
$y = a_0 + a_1 x + e$
– a1: slope
– a0: intercept
– e: error, or residual, between the model and the observations
Linear Least-Squares Regression
• Linear least-squares regression is a method to
determine the “best” coefficients in a linear model
for a given data set.
• “Best” for least-squares regression means minimizing
the sum of the squares of the estimate residuals. For
a straight line model, this gives:
$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i\right)^2$
Least-Squares Fit of a Straight Line
• Using the model
$y = a_0 + a_1 x$
the slope and intercept producing the best fit can be found using:
$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}$
Example
• Fit a straight line to the following force (F, in N) versus
velocity (V, in m/s) data:

 i     xi = V (m/s)   yi = F (N)   xi^2    xi*yi
 1     10             25           100     250
 2     20             70           400     1400
 3     30             380          900     11400
 4     40             550          1600    22000
 5     50             610          2500    30500
 6     60             1220         3600    73200
 7     70             830          4900    58100
 8     80             1450         6400    116000
 Sum   360            5135         20400   312850
$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2} = \frac{8(312850) - (360)(5135)}{8(20400) - (360)^2} = 19.47024$

$a_0 = \bar{y} - a_1 \bar{x} = 641.875 - 19.47024(45) = -234.2857$

$F_{est} = -234.2857 + 19.47024\,v$
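The same coefficients can be reproduced in MATLAB; a minimal sketch of the formulas above (variable names are our own):

    v = [10 20 30 40 50 60 70 80];            % velocity (m/s)
    F = [25 70 380 550 610 1220 830 1450];    % force (N)
    n = length(v);
    a1 = (n*sum(v.*F) - sum(v)*sum(F)) / (n*sum(v.^2) - sum(v)^2);  % slope, 19.4702
    a0 = mean(F) - a1*mean(v);                % intercept, -234.2857
    Fest = a0 + a1*v;                         % fitted line at the data points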
Standard Error of the Estimate
• Regression data showing (a) the spread of the data around the mean
of the dependent data and (b) the spread of the data around the
best-fit line.
• The reduction in spread represents the improvement due to
linear regression.
MATLAB Functions
• MATLAB has a built-in function polyfit that fits a
least-squares nth order polynomial to data:
– p = polyfit(x, y, n)
• x: independent data
• y: dependent data
• n: order of polynomial to fit
• p: coefficients of the polynomial
$f(x) = p_1 x^n + p_2 x^{n-1} + \dots + p_n x + p_{n+1}$
• MATLAB’s polyval command can be used to compute
a value using the coefficients.
– y = polyval(p, x)
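A brief sketch of the two commands together (the data here is made up for illustration):

    x = 0:0.5:5;                                   % independent data
    y = 2 + 3*x - 0.5*x.^2 + 0.3*randn(size(x));   % noisy quadratic
    p = polyfit(x, y, 2);                          % p = [p1 p2 p3], highest power first
    yfit = polyval(p, x);                          % evaluate the fit at x
    plot(x, y, 'o', x, yfit, '-')                  % data points vs. fitted curve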
Polyfit function
• Can be used to perform REGRESSION if the
number of data points is much larger than the
number of coefficients:
– p = polyfit(x, y, n)
• x: independent data (Vce, 10 data points)
• y: dependent data (Ic)
• n: order of polynomial to fit; n = 1 gives a linear fit
• p: coefficients of the polynomial (two coefficients)
$f(x) = p_1 x^n + p_2 x^{n-1} + \dots + p_n x + p_{n+1}$
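A sketch of that regression scenario (the Vce and Ic values are hypothetical):

    Vce = linspace(0, 9, 10);                  % 10 collector-emitter voltage points
    Ic  = 2.0*Vce + 1.5 + 0.2*randn(1, 10);    % roughly linear collector current
    p = polyfit(Vce, Ic, 1);                   % n = 1: p(1) = slope, p(2) = intercept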
Polynomial Regression
• The least-squares procedure can be readily extended to fit
data to a higher-order polynomial. Again, the idea is to
minimize the sum of the squares of the estimate residuals.
• The figure shows the same data fit with:
a) a first-order polynomial
b) a second-order polynomial
Process and Measures of Fit
• For a second order polynomial, the best fit would mean
minimizing:
$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right)^2$
• In general, this would mean minimizing:
$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i - a_2 x_i^2 - \dots - a_m x_i^m\right)^2$
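As a sketch, a second-order fit can be obtained either from polyfit or by solving the least-squares normal equations directly (the data below is made up):

    x = (0:5)';
    y = [2.1; 7.7; 13.6; 27.2; 40.9; 61.1];    % illustrative data
    p = polyfit(x, y, 2);                      % coefficients, highest power first
    Z = [ones(size(x)) x x.^2];                % design matrix [1 x x^2]
    a = (Z'*Z) \ (Z'*y);                       % a = [a0; a1; a2]; matches fliplr(p)'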
INTERPOLATION
Polynomial Interpolation
• You will frequently have occasions to estimate
intermediate values between precise data points.
• The function you use to interpolate must pass
through the actual data points - this makes
interpolation more restrictive than fitting.
• The most common method for this purpose is
polynomial interpolation, where an (n-1)th-order
polynomial is determined that passes through the n data
points:
$f(x) = a_1 + a_2 x + a_3 x^2 + \dots + a_n x^{n-1}$
In MATLAB's convention, the same polynomial is written
with coefficients ordered from the highest power down:
$f(x) = p_1 x^{n-1} + p_2 x^{n-2} + \dots + p_{n-1} x + p_n$
Determining Coefficients using Polyfit
• MATLAB’s built-in polyfit and polyval
commands can also be used; all that is
required is making sure that the order of the fit
for n data points is n-1, as in the sketch below.
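A minimal sketch (the four data points are made up):

    x = [1 2 3 4];
    y = [3.2 2.7 5.1 4.4];              % hypothetical values
    p = polyfit(x, y, length(x)-1);     % order n-1 for n points: exact interpolation
    polyval(p, x)                       % reproduces y to within round-off
    yi = polyval(p, 2.5)                % estimate at an intermediate point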
Newton Interpolating Polynomials
• Another way to express a polynomial
interpolation is to use Newton’s interpolating
polynomial.
• The differences between a simple polynomial
and Newton’s interpolating polynomial for
first and second order interpolations are:
 Order   Simple                              Newton
 1st     $f_1(x) = a_1 + a_2 x$              $f_1(x) = b_1 + b_2 (x - x_1)$
 2nd     $f_2(x) = a_1 + a_2 x + a_3 x^2$    $f_2(x) = b_1 + b_2 (x - x_1) + b_3 (x - x_1)(x - x_2)$
Newton Interpolating Polynomials
(contd.)
• The first-order Newton
interpolating polynomial may
be obtained from linear
interpolation and similar
triangles, as shown.
• The resulting formula based
on known points x1 and x2 and
the values of the dependent
function at those points is:
$f_1(x) = f(x_1) + \frac{f(x_2) - f(x_1)}{x_2 - x_1} (x - x_1)$
Newton Interpolating Polynomials
(contd.)
• The second-order Newton
interpolating polynomial
introduces some curvature to the
line connecting the points, but
still goes through the first two
points.
• The resulting formula based on
known points x1, x2, and x3 and
the values of the dependent
function at those points is:
$f_2(x) = f(x_1) + \frac{f(x_2) - f(x_1)}{x_2 - x_1}(x - x_1) + \frac{\frac{f(x_3) - f(x_2)}{x_3 - x_2} - \frac{f(x_2) - f(x_1)}{x_2 - x_1}}{x_3 - x_1}(x - x_1)(x - x_2)$
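A short MATLAB sketch of this formula, using three samples of ln(x) as the known points (the choice of test function is ours):

    x  = [1 4 6];
    fx = log(x);                                    % ln(1), ln(4), ln(6)
    b1 = fx(1);
    b2 = (fx(2) - fx(1)) / (x(2) - x(1));           % first divided difference
    b3 = ((fx(3) - fx(2)) / (x(3) - x(2)) - b2) / (x(3) - x(1));  % second divided difference
    xi = 2;
    f2 = b1 + b2*(xi - x(1)) + b3*(xi - x(1))*(xi - x(2))
    % f2 = 0.5658, against the true value ln(2) = 0.6931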
Lagrange Interpolating Polynomials
• Another way to express an interpolating polynomial is the
Lagrange interpolating polynomial.
• The differences between a simple polynomial and the
Lagrange interpolating polynomial for first- and second-order
polynomials are:

 Order   Simple                              Lagrange
 1st     $f_1(x) = a_1 + a_2 x$              $f_1(x) = L_1 f(x_1) + L_2 f(x_2)$
 2nd     $f_2(x) = a_1 + a_2 x + a_3 x^2$    $f_2(x) = L_1 f(x_1) + L_2 f(x_2) + L_3 f(x_3)$
where the Li are weighting coefficients that are
functions of x.
Lagrange Interpolating Polynomials
(contd.)
• The first-order Lagrange
interpolating polynomial may be
obtained from a weighted
combination of two linear
interpolations, as shown.
• The resulting formula based on
known points x1 and x2 and the
values of the dependent
function at those points is:
$f_1(x) = L_1 f(x_1) + L_2 f(x_2)$, where $L_1 = \frac{x - x_2}{x_1 - x_2}$ and $L_2 = \frac{x - x_1}{x_2 - x_1}$, so that

$f_1(x) = \frac{x - x_2}{x_1 - x_2} f(x_1) + \frac{x - x_1}{x_2 - x_1} f(x_2)$

• The second-order version, based on known points x1, x2, and x3, is:

$f_2(x) = \frac{(x - x_2)(x - x_3)}{(x_1 - x_2)(x_1 - x_3)} f(x_1) + \frac{(x - x_1)(x - x_3)}{(x_2 - x_1)(x_2 - x_3)} f(x_2) + \frac{(x - x_1)(x - x_2)}{(x_3 - x_1)(x_3 - x_2)} f(x_3)$
• As with Newton’s method, the Lagrange version has an
estimated error of:

$R_n = f[x, x_n, x_{n-1}, \dots, x_0] \prod_{i=0}^{n} (x - x_i)$
Lagrange Interpolating Polynomials
(contd.)
• In general, the Lagrange polynomial
interpolation for n points
is:
$f_{n-1}(x) = \sum_{i=1}^{n} L_i(x)\, f(x_i)$

where Li is given by:

$L_i(x) = \prod_{\substack{j=1 \\ j \ne i}}^{n} \frac{x - x_j}{x_i - x_j}$
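A compact sketch of the general formula as a MATLAB function (the function name and variable names are our own):

    function yi = lagrange_interp(x, y, xi)
    % x, y: the n known data points; xi: point(s) at which to interpolate
    n  = length(x);
    yi = zeros(size(xi));
    for i = 1:n
        Li = ones(size(xi));                       % weighting coefficient L_i evaluated at xi
        for j = [1:i-1, i+1:n]                     % every j except j = i
            Li = Li .* (xi - x(j)) / (x(i) - x(j));
        end
        yi = yi + Li * y(i);
    end
    end

For example, lagrange_interp([1 4 6], log([1 4 6]), 2) returns 0.5658, matching the second-order Newton result above.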