
Numerical Method Derive the regression formula for high-order polynomial models


Question

Numerical Method


Derive the regression formula for high-order polynomial models and submit it as ME2016_Homework5_YourLastName_FirstName.docx (or .pdf, .jpg).

1.1 To build a regression model y(x) = a0 + a1*x + a2*x^2 with one independent variable x from a set of n data points by the least-squares error method, derive step by step that the coefficients a0, a1, a2 can be computed by solving a linear system in matrix form.

1.2 To build a regression model y(x) = a0 + a1*x + a2*x^2 + a3*x^3 with one independent variable x from a set of n data points by the least-squares error method, derive step by step that the coefficients a0, a1, a2, a3 can be computed by solving a linear system in matrix form.

Explanation / Answer

1.1

Let X be the following n×3 matrix:

[ 1  x1  x1^2 ]
[ 1  x2  x2^2 ]
[ :  :   :    ]
[ 1  xn  xn^2 ]


Let Y be the column vector [y1 y2 ... yn]^(t), and let a be the column vector [a0 a1 a2]^(t) [^(t) denotes transpose].


The sum of squared errors is then the scalar:

Z = (Y - X*a)^(t) * (Y - X*a)

We need to minimise Z with respect to the vector a. Setting the gradient of Z to zero:

dZ/da = -2 * X^(t) * (Y - X*a) = 0  =>  X^(t)*Y = (X^(t)*X)*a

Thus a is the solution of the matrix equation:

(X^(t)*X)*a = X^(t)*Y

Written out entrywise, X^(t)*X contains the power sums of the xi and X^(t)*Y the moment sums, giving the required matrix form:

[ n       Σxi     Σxi^2 ] [a0]   [ Σyi      ]
[ Σxi     Σxi^2   Σxi^3 ] [a1] = [ Σxi*yi   ]
[ Σxi^2   Σxi^3   Σxi^4 ] [a2]   [ Σxi^2*yi ]

This completes the proof.
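As a quick numerical sanity check (not part of the derivation itself), the normal equations above can be solved directly with NumPy; the data below are made-up illustration values sampled from a known quadratic:

```python
import numpy as np

# Hypothetical sample data generated from the known quadratic y = 1 + 2x + 3x^2
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = 1.0 + 2.0 * x + 3.0 * x**2

# Design matrix X with columns [1, x_i, x_i^2]
X = np.column_stack([np.ones_like(x), x, x**2])

# Normal equations: (X^T X) a = X^T Y
a = np.linalg.solve(X.T @ X, X.T @ y)
print(a)  # recovers [1, 2, 3] up to rounding
```

Because the data are noise-free, the fitted coefficients match the generating polynomial to floating-point precision.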


1.2 The cubic case follows by exactly the same steps, since the proof above is generic: append a column of xi^3 to X and an entry a3 to a. The resulting normal equations (X^(t)*X)*a = X^(t)*Y are then a 4×4 linear system whose entries are the power sums Σxi^k for k = 0, ..., 6 and the moment sums Σxi^k * yi for k = 0, ..., 3.
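The same computation extends to the cubic model simply by adding the x^3 column; a sketch with illustrative (made-up) data from a known cubic:

```python
import numpy as np

# Hypothetical data generated from the known cubic y = 1 - x + 0.5x^2 + 2x^3
x = np.linspace(-2.0, 2.0, 9)
y = 1.0 - x + 0.5 * x**2 + 2.0 * x**3

# Design matrix with columns [1, x, x^2, x^3]
X = np.vander(x, N=4, increasing=True)

# Solve the 4x4 normal equations (X^T X) a = X^T Y
a = np.linalg.solve(X.T @ X, X.T @ y)
print(a)  # recovers [1, -1, 0.5, 2] up to rounding
```

For high polynomial degrees, X^(t)*X becomes ill-conditioned, so in practice `np.linalg.lstsq` (which works on X directly) is preferred over explicitly forming the normal equations.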