Prediction function in case of linear regression with polynomial feature expansion

  1. We know that linear regression outputs a linear prediction function. When we use polynomial feature expansion with linear regression, our prediction function is of the form y = c + m_1*x_1 + m_2*x_2 + ... + m_n*x_n. How does such a linear function give us a non-linear curve, as seen in the plot?

  2. In scikit-learn, can we get the equation of the prediction function used by a fitted LinearRegression model?

Imagine that the true generative model is y_true = 10 * x**2. A linear model cannot capture this because, at best, it fits y_hat = coef * x. If we add a new feature x**2 to the data, the model will associate a coefficient with it, and we will be able to recover the generative model. We would then have something like y_hat = coef_1 * x + coef_2 * x**2.

Therefore, the model is still a linear combination of the input features. However, we augmented the features by adding one that is a non-linear transformation of an original feature. Thus, we no longer get a straight line as a function of x, because we also have the feature x**2 (and thus a parabola in this example).
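A minimal sketch of this idea, using the y_true = 10 * x**2 example above (the data-generation details such as the sample size and the range of x are arbitrary choices for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Data generated from the true model y = 10 * x**2 (no noise)
rng = np.random.RandomState(0)
x = rng.uniform(-3, 3, size=(100, 1))
y = 10 * x.ravel() ** 2

# Expand the single feature x into [x, x**2]
poly = PolynomialFeatures(degree=2, include_bias=False)
X_poly = poly.fit_transform(x)

# The model is still linear, but in the expanded feature space
model = LinearRegression().fit(X_poly, y)
print(model.coef_, model.intercept_)  # roughly [0., 10.] and 0.
```

The fit recovers a near-zero coefficient for x and a coefficient close to 10 for x**2, matching the generative model.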

This is a linear model, so the equation would be y_hat = intercept_ + coef_1 * x_1 + coef_2 * x_2 + .... The coefficients and intercept are the fitted parameters, which you can retrieve from a fitted model with model.coef_ and model.intercept_.