What are orthogonal polynomials and how are they used in linear regression models

Well, I am doing the StatsLearning Coursera course from Stanford and I didn't understand the use of orthogonal polynomials in a linear regression model.

After much looking around on the web I have finally understood how it all fits together.

In linear regression you look for the coefficients \alpha_j that minimize the sum of squared errors of the fit y_i = \alpha_0 + \alpha_1 x_i + \dots + \alpha_j x_i^j, where i ranges over all the samples we have and j is the degree of the polynomial we are using to fit the data.
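As a minimal sketch of this raw-polynomial fit (the data, degree, and variable names are my own, invented for illustration), numpy's least-squares solver can recover the \alpha_j directly from the power basis:

```python
import numpy as np

# Toy data (made up for illustration): a noisy quadratic signal.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

# Design matrix with the raw powers x^0 .. x^degree.
degree = 2
X = np.vander(x, degree + 1, increasing=True)

# Least-squares estimate of the alpha_j coefficients.
alpha, *_ = np.linalg.lstsq(X, y, rcond=None)
print(alpha)  # roughly [1, 2, -3]
```

With raw powers the columns of X become highly correlated as the degree grows, which is exactly the problem the orthogonal basis below avoids.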

When we use orthogonal polynomials, we instead fit the data with the following expression:

y_i = \alpha_0 + \alpha_1 (a_{11} x_i + a_{10}) + \alpha_2 (a_{22} x_i^2 + a_{21} x_i + a_{20}) + \dots

where the polynomials p_j(x) = a_{jj} x^j + \dots + a_{j1} x + a_{j0} are orthogonal to each other, in the sense that:

\sum\limits_{k=1}^N p_i(x_k)\, p_j(x_k) = 0 \quad \text{for } i \neq j, where N is the number of samples.

So the coefficients a_{jl} of the polynomials are chosen precisely to make this sum equal to zero, and these are the polynomials that R provides when you use the poly function inside an lm formula.
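One way to see how such a basis can be constructed (a sketch, not R's exact implementation, though R's poly works along the same lines up to scaling) is to take the QR decomposition of the raw-power design matrix: the columns of Q are then orthogonal combinations of 1, x, x^2, ..., i.e. the orthogonal polynomials evaluated at the sample points:

```python
import numpy as np

x = np.linspace(-1, 1, 50)
degree = 3

# Raw power basis: columns 1, x, x^2, x^3 evaluated at the samples.
X = np.vander(x, degree + 1, increasing=True)

# QR decomposition: each column of Q is a polynomial in x whose
# values are orthogonal (here orthonormal) to the other columns.
Q, R = np.linalg.qr(X)

# Check the defining property: sum_k p_i(x_k) p_j(x_k) = 0 for i != j.
gram = Q.T @ Q
print(np.allclose(gram, np.eye(degree + 1)))  # True
```

Regressing y on the columns of Q gives the same fitted values as the raw-power fit, but because the columns are orthogonal, each coefficient estimate is unaffected by whether the higher-degree terms are included.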

Here are the links I used to clarify the topic:





