Least Squares Fitting
of Data to a Curve
Gerald Recktenwald
Portland State University
Department of Mechanical Engineering
[email protected]
These slides are a supplement to the book Numerical Methods with Matlab: Implementations and Applications, by Gerald W. Recktenwald, © 2000–2007, Prentice-Hall, Upper Saddle River, NJ. These slides are copyright © 2000–2007 Gerald W. Recktenwald. The PDF version of these slides may be downloaded or stored or printed only for noncommercial, educational use. The repackaging or sale of these slides in any form, without written consent of the author, is prohibited.

The latest version of this PDF file, along with other supplemental material for the book, can be found at www.prenhall.com/recktenwald or web.cecs.pdx.edu/~gerry/nmm/.
Version 0.82
November 6, 2007
Overview
Fitting a line to data
Geometric interpretation
Residuals of the overdetermined system
The normal equations
Nonlinear fits via coordinate transformation
Fitting arbitrary linear combinations of basis functions
Mathematical formulation
Solution via normal equations
Solution via QR factorization
Polynomial curve fits with the built-in polyfit function
Multivariate fitting
Fitting a Line to Data
Given m pairs of data:

    (xi, yi),   i = 1, ..., m

Find the coefficients α and β such that

    F(x) = αx + β

is a good fit to the data.

Questions:

How do we define a good fit?
How do we compute α and β after a definition of good fit is obtained?
Plausible Fits
Plausible fits are obtained by adjusting the slope (α) and intercept (β). Here is a graphical representation of potential fits to a particular set of data.

Which of the lines provides the best fit?
[Figure: several candidate lines drawn through the same set of data points.]
The Residual
The difference between the given yi value and the fit function evaluated at xi is

    ri = yi − F(xi) = yi − (αxi + β)

ri is the residual for the data pair (xi, yi). It is the vertical distance between the known data and the fit function.

[Figure: vertical residuals between the data points and a candidate line.]
Minimizing the Residual
Two criteria for choosing the best fit:

    minimize Σ |ri|        or        minimize Σ ri²

For statistical and computational reasons, choose minimization of

    ρ = Σ ri² = Σ [yi − (αxi + β)]²,   sums over i = 1, ..., m

The best fit is obtained by the values of α and β that minimize ρ.
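To make the objective concrete, here is a minimal Matlab sketch (not from the book) that evaluates ρ for trial values of α and β, using the small data set from the line fitting example later in these slides:

    x = [1 2 4 5];  y = [1 2 2 3];
    rho = @(alpha,beta) sum((y - (alpha*x + beta)).^2);  %  rho(alpha, beta)
    rho(0.4, 0.8)   %  objective at the least squares solution (= 0.4)
    rho(0.5, 0.5)   %  any other choice gives a larger value  (= 0.5)

Any other (α, β) gives ρ > 0.4, which is exactly what the least squares solution guarantees.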
Orthogonal Distance Fit
An alternative to minimizing the residual is to minimize the orthogonal distance di from each data point to the line.

Minimizing Σ di² is known as the Orthogonal Distance Regression problem.

See, e.g., Åke Björck, Numerical Methods for Least Squares Problems, SIAM, Philadelphia, 1996.
[Figure: orthogonal distances d1, ..., d4 from the data points at x1, ..., x4 to the line.]
Least Squares Fit (1)

The least squares fit is obtained by choosing the α and β so that

    Σ ri²   (i = 1, ..., m)

is a minimum. Let ρ = ‖r‖₂² to simplify the notation.

Find α and β by minimizing ρ = ρ(α, β). The minimum requires

    ∂ρ/∂α = 0  (β held constant)        and        ∂ρ/∂β = 0  (α held constant)
Least Squares Fit (2)

Carrying out the differentiation leads to

    Sxx α + Sx β = Sxy    (1)
    Sx α  + m β  = Sy     (2)

where

    Sxx = Σ xi·xi,    Sx = Σ xi,    Sxy = Σ xi·yi,    Sy = Σ yi     (sums over i = 1, ..., m)

Note: Sxx, Sx, Sxy, and Sy can be directly computed from the given (xi, yi) data. Thus, Equations (1) and (2) are two equations for the two unknowns α and β.
Least Squares Fit (3)

Solving equations (1) and (2) for α and β yields

    α = (1/d) (Sx Sy − m Sxy)     (3)
    β = (1/d) (Sx Sxy − Sxx Sy)   (4)

with

    d = Sx² − m Sxx               (5)
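As a quick check, a minimal sketch (not the book's linefit function) that evaluates equations (3)–(5) directly from the sums, using the sample data from the line fitting example below:

    x = [1 2 4 5];  y = [1 2 2 3];
    m   = length(x);
    Sx  = sum(x);      Sy  = sum(y);
    Sxx = sum(x.*x);   Sxy = sum(x.*y);
    d     = Sx^2 - m*Sxx;               %  equation (5)
    alpha = (Sx*Sy  - m*Sxy)/d          %  slope, equation (3):      0.4
    beta  = (Sx*Sxy - Sxx*Sy)/d         %  intercept, equation (4):  0.8

These values agree with the linefit result shown later in these slides.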
Overdetermined System for a Line Fit (1)
Now, let's rederive the equations for the fit. This will give us insight into the process of fitting arbitrary linear combinations of functions.

For any two points we can write

    α x1 + β = y1
    α x2 + β = y2

or

    [ x1  1 ]  [ α ]     [ y1 ]
    [ x2  1 ]  [ β ]  =  [ y2 ]
But why just pick two points?
Overdetermined System for a Line Fit (2)
Writing out the αx + β = y equation for all of the known points (xi, yi), i = 1, ..., m gives the overdetermined system

    [ x1  1 ]             [ y1 ]
    [ x2  1 ]  [ α ]      [ y2 ]
    [  ⋮  ⋮ ]  [ β ]  =   [  ⋮ ]        or        Ac = y
    [ xm  1 ]             [ ym ]

where

    A = [ x1  1 ;  x2  1 ;  … ;  xm  1 ],      c = [ α ; β ],      y = [ y1 ; y2 ; … ; ym ]

Note: We cannot solve Ac = y with Gaussian elimination. Unless the system is consistent (i.e., unless y lies in the column space of A) it is impossible to find the c = (α, β)ᵀ that exactly satisfies all m equations. The system is consistent only if all the data points lie along a single line.
Normal Equations for a Line Fit
Compute ρ = ‖r‖₂², where r = y − Ac:

    ρ = ‖r‖₂² = rᵀr = (y − Ac)ᵀ(y − Ac)
      = yᵀy − (Ac)ᵀy − yᵀ(Ac) + cᵀAᵀAc
      = yᵀy − 2yᵀAc + cᵀAᵀAc.

Minimizing ρ requires

    ∂ρ/∂c = −2Aᵀy + 2AᵀAc = 0

or

    (AᵀA)c = Aᵀy

This is the matrix formulation of equations (1) and (2).
linefit.m
The linefit function fits a line to a set of data by solving the normal equations.

function [c,R2] = linefit(x,y)
% linefit     Least-squares fit of data to y = c(1)*x + c(2)
%
% Synopsis:   c     = linefit(x,y)
%            [c,R2] = linefit(x,y)
%
% Input:   x,y = vectors of independent and dependent variables
%
% Output:  c  = vector of slope, c(1), and intercept, c(2), of least sq. line fit
%          R2 = (optional) coefficient of determination;  0 <= R2 <= 1
%               R2 close to 1 indicates a strong relationship between y and x

if length(y)~= length(x),  error('x and y are not compatible');  end

x = x(:);  y = y(:);         %  Make sure that x and y are column vectors
A = [x ones(size(x))];       %  m-by-n matrix of overdetermined system
c = (A'*A)\(A'*y);           %  Solve normal equations
if nargout>1
  r = y - A*c;
  R2 = 1 - (norm(r)/norm(y-mean(y)))^2;
end
Line Fitting Example
Store data and perform the fit:

>> x = [1 2 4 5];
>> y = [1 2 2 3];
>> c = linefit(x,y)
c =
    0.4000
    0.8000

Evaluate and plot the fit:

>> xfit = linspace(min(x),max(x));
>> yfit = c(1)*xfit + c(2);
>> plot(x,y,'o',xfit,yfit,'-');

[Figure: the data points and the least squares line; vertical axis labeled "y data and fit function".]
R2 Statistic (1)

R² is a measure of how well the fit function follows the trend in the data. 0 ≤ R² ≤ 1.

Define:

    ŷ  is the value of the fit function at the known data points. For a line fit, ŷi = c1 xi + c2.
    ȳ  is the average of the y values:  ȳ = (1/m) Σ yi

Then:

    R² = Σ(ŷi − ȳ)² / Σ(yi − ȳ)²  =  1 − ‖r‖₂² / Σ(yi − ȳ)²

When R² ≈ 1 the fit function follows the trend of the data. When R² ≈ 0 the fit is not significantly better than approximating the data by its mean.
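A minimal sketch (not from the book) that computes R² directly from this definition, assuming x, y, and c from the line fitting example above; it reproduces the value returned by linefit:

    yhat = c(1)*x + c(2);                           %  fit function at the data points
    ybar = mean(y);
    R2a = sum((yhat-ybar).^2)/sum((y-ybar).^2)      %  first form of the definition
    R2b = 1 - (norm(y-yhat)/norm(y-ybar))^2         %  second form; equal for fits with an intercept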
Graphical Interpretation of the R2 Statistic
Consider a line fit to a data set with

    R² = 1 − ‖r‖₂² / Σ(yi − ȳ)² = 0.934

[Figure, left: vertical distances between the given y data and the least squares line fit; the vertical lines show the contributions to ‖r‖₂.]
[Figure, right: vertical distances between the given y data and the average ȳ; the vertical lines show the contributions to Σ(yi − ȳ)².]
R2 Statistic: Example Calculation
Consider the variation of the bulk modulus of Silicon Carbide as a function of temperature (Cf. Example 9.4):

    T (°C)    20    500   1000   1200   1400   1500
    G (GPa)   203   197    191    188    186    184

>> [t,D,labels] = loadColData('SiC.dat',6,5);
>> g = D(:,1);
>> [c,R2] = linefit(t,g)
c =
   -0.0126
  203.3319
R2 =
    0.9985

[Figure: bulk modulus (GPa) of SiC versus temperature (°C), data and line fit.]
Fitting Transformed Non-linear Functions (1)

Some nonlinear fit functions y = F(x) can be transformed to an equation of the form v = αu + β.

A linear least squares line fit is performed on the transformed variables.

Parameters of the nonlinear fit function are obtained by transforming back to the original variables.

The linear least squares fit to the transformed equations does not yield the same fit coefficients as a direct solution to the nonlinear least squares problem involving the original fit function.

Examples:

    y = c1 e^(c2 x)      →    ln y = αx + β
    y = c1 x^c2          →    ln y = α ln x + β
    y = c1 x e^(c2 x)    →    ln(y/x) = αx + β
Fitting Transformed Non-linear Functions (2)

Consider

    y = c1 e^(c2 x)    (6)

Taking the logarithm of both sides yields

    ln y = ln c1 + c2 x

Introducing the variables

    v = ln y,    b = ln c1,    a = c2

transforms equation (6) to

    v = ax + b
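In Matlab the whole procedure is only a few lines; a minimal sketch (not a book function), reusing linefit and assuming y > 0:

    v = log(y(:));           %  transformed variable, v = ln y
    a = linefit(x(:),v);     %  a(1) = c2,  a(2) = ln(c1)
    c = [exp(a(2)); a(1)]    %  back-transform:  c(1) = c1,  c(2) = c2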
Fitting Transformed Non-linear Functions (3)

The preceding steps are equivalent to graphically obtaining c1 and c2 by plotting the data on semilog paper.

    y = c1 e^(c2 x)    ⟺    ln y = c2 x + ln c1

[Figure: the same data plotted on linear axes (left) and on semilog axes (right), where the transformed data fall on a straight line.]
Fitting Transformed Non-linear Functions (4)

Consider y = c1 x^c2. Taking the logarithm of both sides yields

    ln y = ln c1 + c2 ln x    (7)

Introduce the transformed variables

    v = ln y,    u = ln x,    b = ln c1,    a = c2

and equation (7) can be written

    v = au + b
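The corresponding Matlab sketch (again hypothetical, assuming x > 0 and y > 0) transforms both variables before calling linefit:

    u = log(x(:));  v = log(y(:));   %  transformed variables
    a = linefit(u,v);                %  a(1) = c2,  a(2) = ln(c1)
    c = [exp(a(2)); a(1)]            %  c(1) = c1,  c(2) = c2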
Fitting Transformed Non-linear Functions (5)

The preceding steps are equivalent to graphically obtaining c1 and c2 by plotting the data on log-log paper.

    y = c1 x^c2    ⟺    ln y = c2 ln x + ln c1

[Figure: the same data plotted on linear axes (left) and on log-log axes (right), where the transformed data fall on a straight line.]
Example: Fitting Data to y = c1 x e^(c2 x)

Consider y = c1 x e^(c2 x). The transformation

    v = ln(y/x),    a = c2,    b = ln c1

results in the linear equation

    v = ax + b
Fitting Transformed Non-linear Functions (6)

The preceding steps are equivalent to graphically obtaining c1 and c2 by plotting the data on semilog paper.

    y = c1 x e^(c2 x)    ⟺    ln(y/x) = c2 x + ln c1

[Figure: the same data plotted on linear axes (left) and with ln(y/x) on semilog axes (right).]
xexpfit.m
The xexpfit function uses a linearizing transformation to fit y = c1 x e^(c2 x) to data.

function c = xexpfit(x,y)
% xexpfit  Least squares fit of data to y = c(1)*x*exp(c(2)*x)
%
% Synopsis:  c = xexpfit(x,y)
%
% Input:   x,y = vectors of independent and dependent variable values
%
% Output:  c = vector of coefficients of y = c(1)*x*exp(c(2)*x)

z = log(y./x);          %  Natural log of element-by-element division
c = linefit(x,z);       %  Fit is performed by linefit
c = [exp(c(2)); c(1)];  %  Extract parameters from transformation
Example: Fit Synthetic Data
Fit y = c1 x e^(c2 x) to synthetic data (200 points in the synthetic data set). See demoXexp.

>> % Synthetic data with noise, avoid x=0
>> x0 = 0.01;
>> noise = 0.05;
>> x = linspace(x0,2,200);
>> y = 5*x.*exp(-3*x);
>> yn = y + noise*(rand(size(x))-0.5);
>> % Guarantee yn>0 for log(yn)
>> yn = abs(yn);
>> c = xexpfit(x,yn)
c =
    5.7701
   -3.2330

[Figure: original curve, noisy data, and fit; legend reports c1 = 5.770, c2 = −3.233.]
Summary of Transformations
Transform the (x, y) data as needed
Use linefit
Transform the results of linefit back

>> x = ...            %  original data
>> y = ...            %  original data
>> u = ...            %  transform the data
>> v = ...            %  transform the data
>> a = linefit(u,v)
>> c = ...            %  transform the coefficients
Summary of Line Fitting (1)

1. m data pairs are given: (xi, yi), i = 1, ..., m.

2. The fit function y = F(x) = c1 x + c2 has n = 2 basis functions f1(x) = x and f2(x) = 1.

3. Evaluating the fit function for each of the m data points gives an overdetermined system of equations Ac = y where c = [c1, c2]ᵀ, y = [y1, y2, ..., ym]ᵀ, and

    A = [ f1(x1)  f2(x1) ;  f1(x2)  f2(x2) ;  … ;  f1(xm)  f2(xm) ]  =  [ x1  1 ;  x2  1 ;  … ;  xm  1 ].
Summary of Line Fitting (2)

4. The least-squares principle defines the best fit as the values of c1 and c2 that minimize

    ρ(c1, c2) = ‖y − F(x)‖₂² = ‖y − Ac‖₂².

5. Minimizing ρ(c1, c2) leads to the normal equations

    (AᵀA)c = Aᵀy.

6. Solving the normal equations gives the slope c1 and intercept c2 of the least squares line.
Fitting Linear Combinations of Functions
Definition of fit function and basis functions
Formulation of the overdetermined system
Solution via normal equations: fitnorm
Solution via QR factorization: fitqr and \ (see the sketch below)
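The QR-based solution is what Matlab's backslash operator performs for overdetermined systems; a minimal sketch of both routes for the line fit matrix (the variable names are illustrative):

    A = [x ones(size(x))];   %  overdetermined system matrix (x a column vector)
    c  = A\y;                %  least squares solution via QR factorization
    [Q,R] = qr(A,0);         %  economy-size QR factorization
    c2 = R\(Q'*y);           %  the same solution from the explicit factors

Avoiding explicit formation of AᵀA makes the QR route numerically more reliable than the normal equations.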
Fitting Linear Combinations of Functions (1)

Consider the fitting function

    F(x) = c1 f1(x) + c2 f2(x) + … + cn fn(x)

or

    F(x) = Σ cj fj(x),   sum over j = 1, ..., n

The basis functions

    f1(x), f2(x), ..., fn(x)

are chosen by you, the person making the fit. The coefficients

    c1, c2, ..., cn

are determined by the least squares method.
Fitting Linear Combinations of Functions (2)

The F(x) function can be any combination of functions that are linear in the cj. Thus

    1,   x,   x²,   x^(2/3),   sin x,   eˣ,   x e^(4x),   cos(ln 25x)

are all valid basis functions. On the other hand,

    sin(c1 x),    e^(c3 x),    x^c2

are not valid basis functions as long as the cj are the parameters of the fit.

For example, the fit function for a cubic polynomial is

    F(x) = c1 x³ + c2 x² + c3 x + c4,

which has the basis functions

    x³,   x²,   x,   1.
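Any such basis plugs into the same machinery. For instance, a hypothetical fit with basis functions 1, x, and sin x would be set up as (a sketch; x and y are assumed to be column vectors of data):

    A = [ones(size(x))  x  sin(x)];   %  columns = basis functions at the data
    c = (A'*A)\(A'*y);                %  normal equations, as derived below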
Fitting Linear Combinations of Functions (3)

The objective is to find the cj such that F(xi) ≈ yi. Since F(xi) ≠ yi in general, the residual for each data point is

    ri = yi − F(xi) = yi − Σ cj fj(xi),   sum over j = 1, ..., n

The least-squares solution gives the cj that minimize ‖r‖₂.
Fitting Linear Combinations of Functions (4)

Consider the fit function with three basis functions

    y = F(x) = c1 f1(x) + c2 f2(x) + c3 f3(x).

Assume that F(x) acts like an interpolant. Then

    c1 f1(x1) + c2 f2(x1) + c3 f3(x1) = y1,
    c1 f1(x2) + c2 f2(x2) + c3 f3(x2) = y2,
    ...
    c1 f1(xm) + c2 f2(xm) + c3 f3(xm) = ym.

are all satisfied.

For a least squares fit, the equations are not all satisfied, i.e., the fit function F(x) does not pass through the yi data.
Fitting Linear Combinations of Functions (5)

The preceding equations are equivalent to the overdetermined system

    Ac = y,

where

    A = [ f1(x1)  f2(x1)  f3(x1)
          f1(x2)  f2(x2)  f3(x2)
            ⋮       ⋮       ⋮
          f1(xm)  f2(xm)  f3(xm) ],      c = [ c1 ; c2 ; c3 ],      y = [ y1 ; y2 ; … ; ym ].
Fitting Linear Combinations of Functions (6)

If F(x) cannot interpolate the data, then the preceding matrix equation cannot be solved exactly: y does not lie in the column space of A.

The least-squares method provides the compromise solution that minimizes ‖r‖₂ = ‖y − Ac‖₂.

The c that minimizes ‖r‖₂ satisfies the normal equations

    (AᵀA)c = Aᵀy.
Fitting Linear Combinations of Functions (7)

In general, for n basis functions

    A = [ f1(x1)  f2(x1)  …  fn(x1)
          f1(x2)  f2(x2)  …  fn(x2)
            ⋮       ⋮            ⋮
          f1(xm)  f2(xm)  …  fn(xm) ],      c = [ c1 ; c2 ; … ; cn ],      y = [ y1 ; y2 ; … ; ym ].
Example: Fit a Parabola to Six Points (1)

Consider fitting a curve to the following data.

    x     1      2      3       4       5      6
    y    10    5.49   0.89   -0.14   -1.07   0.84

Not knowing anything more about the data, we can start by fitting a polynomial to the data.

[Figure: the six data points.]
Example: Fit a Parabola to Six Points (2)

The equation of a second order polynomial can be written

    y = c1 x² + c2 x + c3

where the ci are the coefficients to be determined by the fit and the basis functions are

    f1(x) = x²,    f2(x) = x,    f3(x) = 1

The A matrix is

    A = [ x1²  x1  1
          x2²  x2  1
           ⋮    ⋮   ⋮
          xm²  xm  1 ]

where, for this data set, m = 6.
Example: Fit a Parabola to Six Points (3)

Define the data

>> x = (1:6)';
>> y = [10 5.49 0.89 -0.14 -1.07 0.84]';

Notice the transposes: x and y must be column vectors.

The coefficient matrix of the overdetermined system is

>> A = [x.^2  x  ones(size(x))];

The coefficient matrix for the normal equations is

>> disp(A'*A)
        2275         441          91
         441          91          21
          91          21           6
Example: Fit a Parabola to Six Points (4)

The right-hand-side vector for the normal equations is

>> disp(A'*y)
   41.2200
   22.7800
   16.0100

Solve the normal equations

>> c = (A'*A)\(A'*y)
c =
    0.8354
   -7.7478
   17.1160
Example: Fit a Parabola to Six Points (5)

Evaluate and plot the fit

>> xfit = linspace(min(x),max(x));
>> yfit = c(1)*xfit.^2 + c(2)*xfit + c(3);
>> plot(x,y,'o',xfit,yfit,'--');

[Figure: F(x) = c1 x² + c2 x + c3 overlaid on the six data points.]
Example: Alternate Fit to Same Six Points (1)

Fit the same points to

    F(x) = c1/x + c2 x

The basis functions are

    1/x,    x

In Matlab:

>> x = (1:6)';
>> y = [10 5.49 0.89 -0.14 -1.07 0.84]';
>> A = [1./x  x];
>> c = (A'*A)\(A'*y)

[Figure: F(x) = c1/x + c2 x compared with the data.]
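A short sketch (not in the original slides) comparing the residual norms of the two candidate models on this data; the variable names are illustrative:

    Apoly = [x.^2  x  ones(size(x))];     %  parabola basis
    cpoly = (Apoly'*Apoly)\(Apoly'*y);
    Ainv  = [1./x  x];                    %  1/x and x basis
    cinv  = (Ainv'*Ainv)\(Ainv'*y);
    norm(y - Apoly*cpoly)                 %  residual norm, parabola fit
    norm(y - Ainv*cinv)                   %  residual norm, c1/x + c2*x fit

The model with the smaller residual norm follows this particular data set more closely.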
Evaluating the Fit Function as a Matrix-Vector Product (1)

We have been writing the fit function as

    y = F(x) = c1 f1(x) + c2 f2(x) + … + cn fn(x)

The overdetermined coefficient matrix contains the basis functions evaluated at the known data:

    A = [ f1(x1)  f2(x1)  …  fn(x1)
          f1(x2)  f2(x2)  …  fn(x2)
            ⋮       ⋮            ⋮
          f1(xm)  f2(xm)  …  fn(xm) ]

Thus, if A is available,

    F(x) = Ac

evaluates F(x) at all values of x, i.e., F(x) is a vector-valued function.
Evaluating the Fit Function as a Matrix-Vector Product (2)

Evaluating the fit function as a matrix-vector product can be performed for any x. Suppose then that we have created an m-file function that evaluates A for any x, for example

function A = xinvxfun(x)
A = [ 1./x(:)  x(:) ];

We evaluate the fit coefficients with

>> x = ...;  y = ...;
>> c = fitnorm(x,y,'xinvxfun');

Then, to plot the fit function after the coefficients of the fit have been computed:

>> xfit = linspace(min(x),max(x));
>> Afit = xinvxfun(xfit);
>> yfit = Afit*c;
>> plot(x,y,'o',xfit,yfit,'--')
Evaluating the Fit Function as a Matrix-Vector Product (3)

Advantages:

The basis functions are defined in only one place: in the routine for evaluating the overdetermined matrix.
Automation of fitting and plotting is easier because all that is needed is one routine for evaluating the basis functions.
The end-user of the fit (not the person performing the fit) can still evaluate the fit function as y = c1 f1(x) + c2 f2(x) + … + cn fn(x).

Disadvantages:

Storage of matrix A for large x vectors consumes memory. This should not be a problem for small n.
Evaluation of the fit function may not be obvious to a reader unfamiliar with linear algebra.
Matlab Implementation in fitnorm

Let A be the m × n matrix defined by

    A = [ f1(x)  f2(x)  …  fn(x) ]

The columns of A are the basis functions evaluated at each of the x data points.

As before, the normal equations are

    AᵀAc = Aᵀy

The user supplies a (usually small) m-file that returns A.
fitnorm.m
function [c,R2,rout] = fitnorm(x,y,basefun)
% fitnorm  Least-squares fit via solution to the normal equations
%
% Synopsis:  c        = fitnorm(x,y,basefun)
%            [c,R2]   = fitnorm(x,y,basefun)
%            [c,R2,r] = fitnorm(x,y,basefun)
%
% Input:   x,y     = vectors of data to be fit
%          basefun = (string) m-file that computes matrix A with columns as
%                    values of basis functions evaluated at x data points.
%
% Output:  c  = vector of coefficients obtained from the fit
%          R2 = (optional) adjusted coefficient of determination;  0 <= R2 <= 1
%          r  = (optional) residuals of the fit

if length(y)~= length(x);  error('x and y are not compatible');  end

A = feval(basefun,x(:));   %  Coefficient matrix of overdetermined system
c = (A'*A)\(A'*y(:));      %  Solve normal equations, y(:) is always a column
if nargout>1
  r = y - A*c;             %  Residuals at data points used to obtain the fit
  [m,n] = size(A);
  R2 = 1 - (m-1)/(m-n-1)*(norm(r)/norm(y-mean(y)))^2;
  if nargout>2,  rout = r;  end
end
Example of User-Supplied m-files

The basis functions for fitting a parabola are

    f1(x) = x²,    f2(x) = x,    f3(x) = 1

Create the m-file poly2Basis.m:

function A = poly2Basis(x)
A = [ x(:).^2  x(:)  ones(size(x(:))) ];

then at the command prompt

>> x = ...;  y = ...;
>> c = fitnorm(x,y,'poly2Basis')

or use an in-line function object:

>> x = ...;  y = ...;
>> Afun = inline('[ x(:).^2  x(:)  ones(size(x(:))) ]');
>> c = fitnorm(x,y,Afun);
Example of User-Supplied m-files

The basis functions for fitting F(x) = c1/x + c2 x are

    1/x,    x

Create the m-file xinvxfun.m:

function A = xinvxfun(x)
A = [ 1./x(:)  x(:) ];

then at the command prompt

>> x = ...;  y = ...;
>> c = fitnorm(x,y,'xinvxfun')

or use an in-line function object:

>> x = ...;  y = ...;
>> Afun = inline('[ 1./x(:)  x(:) ]');
>> c = fitnorm(x,y,Afun);
R2 Statistic (1)

R² can be applied to linear combinations of basis functions. Recall that for a line fit (Cf. § 9.1.4)

    R² = Σ(ŷi − ȳ)² / Σ(yi − ȳ)²  =  1 − ‖r‖₂² / Σ(yi − ȳ)²

where ŷi is the value of the fit function evaluated at xi, and ȳ is the average of the (known) y values.

For a linear combination of basis functions

    ŷi = Σ cj fj(xi),   sum over j = 1, ..., n
R2 Statistic (2)

To account for the reduction in degrees of freedom in the data when the fit is performed, it is technically appropriate to consider the adjusted coefficient of determination

    R²_adjusted = 1 − [ Σ(yi − ŷi)² / (m − n − 1) ] / [ Σ(yi − ȳ)² / (m − 1) ]

fitnorm provides the option of computing R²_adjusted.
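In Matlab notation this is exactly what fitnorm computes, repeated here as a short sketch (r, A, and y as in the fitnorm listing above):

    [m,n] = size(A);                                     %  m data points, n basis functions
    R2adj = 1 - (m-1)/(m-n-1)*(norm(r)/norm(y-mean(y)))^2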
Polynomial Curve Fits with polyfit (1)

Built-in commands for polynomial curve fits:

    polyfit    Obtain coefficients of a least squares curve fit of a polynomial to a given data set.
    polyval    Evaluate a polynomial at a given set of x values.
Polynomial Curve Fits with polyfit (2)

Syntax:

    c = polyfit(x,y,n)
    [c,S] = polyfit(x,y,n)

x and y define the data. n is the desired degree of the polynomial. c is a vector of polynomial coefficients stored in order of descending powers of x:

    p(x) = c1 x^n + c2 x^(n−1) + … + cn x + c(n+1)

S is an optional return argument for polyfit. S is used as input to polyval.
Polynomial Curve Fits with polyfit (3)

Evaluate the polynomial with polyval.

Syntax:

    yf = polyval(c,xf)
    [yf,dy] = polyval(c,xf,S)

c contains the coefficients of the polynomial (returned by polyfit). xf is a scalar or vector of x values at which the polynomial is to be evaluated. yf is a scalar or vector of values of the polynomial: yf = p(xf).

If S is given as an optional input to polyval, then dy is a vector of estimates of the uncertainty in yf.
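A minimal sketch of the error-estimate form, assuming the data and cubic fit of the following example:

    [c,S] = polyfit(x,y,3);                  %  keep the structure S from the fit
    xfit = linspace(min(x),max(x));
    [yf,dy] = polyval(c,xfit,S);             %  dy = estimated uncertainty in yf
    plot(x,y,'o',xfit,yf,'-',xfit,yf+dy,':',xfit,yf-dy,':')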
Example: Polynomial Curve Fit (1)

Fit a polynomial to the following data.

    x     1      2      3       4       5      6
    y    10    5.49   0.89   -0.14   -1.07   0.84

In Matlab:

>> x = (1:6);
>> y = [10 5.49 0.89 -0.14 -1.07 0.84];
>> c = polyfit(x,y,3);
>> xfit = linspace(min(x),max(x));
>> yfit = polyval(c,xfit);
>> plot(x,y,'o',xfit,yfit,'--')
[Figure: cubic polynomial fit overlaid on the six data points.]
Example: Conductivity of Copper Near 0 K (1)

[Figure: measured conductivity (W/m/C) of copper versus temperature (K) for 0 ≤ T ≤ 60 K.]
Example: Conductivity of Copper Near 0 K (2)

The theoretical model of conductivity is

    k(T) = 1 / ( c1/T + c2 T² )

To fit using linear least squares we need to write this as

    1/k(T) = c1/T + c2 T²

which has the basis functions

    1/T,    T²
Example: Conductivity of Copper Near 0 K (3)

The m-file implementing these basis functions is

function y = cuconBasis1(x)
% cuconBasis1  Basis fcns for conductivity model:  1/k = c1/T + c2*T^2
y = [1./x  x.^2];

An m-file that uses fitnorm to fit the conductivity data with the cuconBasis1 function is listed on the next page.
Example: Conductivity of Copper Near 0 K (4)

function conductFit(fname)
% conductFit  LS fit of conductivity data for Copper at low temperatures
%
% Synopsis:  conductFit(fname)
%
% Input:   fname = (optional, string) name of data file;
%                  Default:  fname = 'conduct1.dat'
%
% Output:  Print out of curve fit coefficients and a plot comparing data
%          with the curve fit for two sets of basis functions.

if nargin<1,  fname = 'cucon1.dat';  end    %  Default data file

% --- define basis functions as inline function objects
fun1 = inline('[1./t  t.^2]');          %  t must be a column vector
fun2 = inline('[1./t  t  t.^2]');

% --- read data and perform the fit
[t,k] = loadColData(fname,2,0,2);       %  Read data into t and k
[c1,R21,r1] = fitnorm(t,1./k,fun1);     %  Fit to first set of bases
[c2,R22,r2] = fitnorm(t,1./k,fun2);     %  ... and second set of bases

% --- print results
fprintf('\nCurve fit to data in %s\n\n',fname);
fprintf(' Coefficients of    Basis Fcns 1      Basis Fcns 2\n');
fprintf('    T^(-1)       %16.9e  %16.9e\n',c1(1),c2(1));
fprintf('    T            %16.9e  %16.9e\n',0,c2(2));
fprintf('    T^2          %16.9e  %16.9e\n',c1(2),c2(3));
fprintf('\n   ||r||_2      %12.5f      %12.5f\n',norm(r1),norm(r2));
fprintf('    R2           %12.5f      %12.5f\n',R21,R22);

% --- evaluate and plot the fits
tf = linspace(0.1,max(t))';             %  100 T values:  0 < t <= max(t)
Af1 = feval(fun1,tf);                   %  A matrix evaluated at tf values
kf1 = 1./ (Af1*c1);                     %  Af*c is column vector of 1/kf values
Af2 = feval(fun2,tf);
kf2 = 1./ (Af2*c2);

plot(t,k,'o',tf,kf1,'--',tf,kf2,'-');
xlabel('Temperature (K)');  ylabel('Conductivity (W/m/C)');
legend('data','basis 1','basis 2');