Optional Lab: Feature Engineering and Polynomial Regression
Goals
In this lab you will:
explore feature engineering and polynomial regression, which allow you to use the machinery of linear regression to fit very complicated, even very non-linear, functions.
Tools
You will utilize the functions developed in previous labs as well as matplotlib and NumPy.
Feature Engineering and Polynomial Regression Overview
Out of the box, linear regression provides a means of building models of the form:

$$f_{\mathbf{w},b}(\mathbf{x}) = w_0x_0 + w_1x_1 + \dots + w_{n-1}x_{n-1} + b \tag{1}$$

What if your features/data are non-linear or are combinations of features? For example, housing prices do not tend to be linear with living area but penalize very small or very large houses, resulting in the curves shown in the graphic above. How can we use the machinery of linear regression to fit this curve? Recall, the 'machinery' we have is the ability to modify the parameters $\mathbf{w}$, $b$ in (1) to 'fit' the equation to the training data. However, no amount of adjusting of $\mathbf{w}$, $b$ in (1) will achieve a fit to a non-linear curve.
Polynomial Features
Above we were considering a scenario where the data was non-linear. Let's try using what we know so far to fit a non-linear curve. We'll start with a simple quadratic: $y = 1 + x^2$.
You're familiar with all the routines we're using. They are available in the lab_utils.py file for review. We'll use np.c_[..], which is a NumPy routine to concatenate along the column boundary.
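As a minimal, self-contained sketch of this first attempt (the lab uses its own plotting and gradient-descent helpers; the run_gd function below is a stand-in written for illustration, and the learning rate and iteration count are assumed values), we fit a plain linear model using the single feature $x$:

```python
import numpy as np

# Quadratic target, as in the lab: y = 1 + x**2
x = np.arange(0, 20, 1)
y = 1 + x**2
X = x.reshape(-1, 1)                    # a single feature, shape (m, 1)

def run_gd(X, y, iterations=1000, alpha=1e-2):
    """Minimal batch gradient descent for f(x) = X @ w + b with squared-error cost."""
    m, n = X.shape
    w, b = np.zeros(n), 0.0
    for _ in range(iterations):
        err = X @ w + b - y             # prediction errors, shape (m,)
        w -= alpha * (X.T @ err) / m    # gradient of the cost with respect to w
        b -= alpha * err.mean()         # gradient of the cost with respect to b
    return w, b

w, b = run_gd(X, y, iterations=1000, alpha=1e-2)
print(f"w = {w}, b = {b:.4f}")          # a straight line cannot track y = 1 + x**2
```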
Well, as expected, not a great fit. What is needed is something like $y = w_0x_0^2 + b$, or a polynomial feature. To accomplish this, you can modify the input data to engineer the needed features. If you swap the original data with a version that squares the $x$ value, then you can achieve $y = w_0x_0^2 + b$. Let's try it. Swap X for X**2 below:
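A sketch of that swap, using the same simple gradient-descent loop (the small learning rate and the iteration count are illustrative assumptions, not necessarily the lab's exact values):

```python
import numpy as np

x = np.arange(0, 20, 1)
y = 1 + x**2

# Engineered feature: use x**2 instead of x, so the target is linear in the feature.
X = (x**2).reshape(-1, 1)

# Minimal batch gradient descent; alpha is small because x**2 spans 0..361.
w, b = np.zeros(1), 0.0
alpha, iterations = 1e-5, 10_000
for _ in range(iterations):
    err = X @ w + b - y
    w -= alpha * (X.T @ err) / len(y)
    b -= alpha * err.mean()

print(f"w = {w}, b = {b:.4f}")   # expect w close to [1.] and b close to 0
```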
Great! Near perfect fit. Notice the values of $\mathbf{w}$ and $b$ printed right above the graph: w,b found by gradient descent: w: [1.], b: 0.0490. Gradient descent modified our initial values of $\mathbf{w}, b$ to be (1.0, 0.049), or a model of $y = 1 \cdot x_0^2 + 0.049$, very close to our target of $y = 1 \cdot x_0^2 + 1$. If you ran it longer, it could be a better match.
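It may not always be obvious which engineered feature is needed, so the next step adds several candidates, $x$, $x^2$ and $x^3$, and lets gradient descent weight them. A minimal sketch (the target, learning rate and iteration count here are illustrative assumptions chosen to be consistent with the fitted values quoted below):

```python
import numpy as np

x = np.arange(0, 20, 1)
y = x**2                           # quadratic target assumed for this sketch

# Several candidate engineered features; gradient descent will weight them.
X = np.c_[x, x**2, x**3]           # columns: x, x^2, x^3

w, b = np.zeros(X.shape[1]), 0.0
alpha, iterations = 1e-7, 10_000   # tiny alpha: the x**3 column reaches 6859
for _ in range(iterations):
    err = X @ w + b - y
    w -= alpha * (X.T @ err) / len(y)
    b -= alpha * err.mean()

print(f"w = {w}, b = {b:.4f}")     # the weight on the x^2 column is emphasized
```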
Note the value of $\mathbf{w}$, [0.08 0.54 0.03], and b is 0.0106. This implies the model after fitting/training is: $0.08x + 0.54x^2 + 0.03x^3 + 0.0106$. Gradient descent has emphasized the data that is the best fit to the $x^2$ data by increasing the $w_1$ term relative to the others. If you were to run for a very long time, it would continue to reduce the impact of the other terms.
Gradient descent is picking the 'correct' features for us by emphasizing the associated parameters.
Let's review this idea:
Initially, the features were re-scaled so they are comparable to each other
A smaller weight value implies a less important/correct feature; in the extreme, when the weight becomes zero or very close to zero, the associated feature is not useful in fitting the model to the data.
Above, after fitting, the weight associated with the $x^2$ feature is much larger than the weights for $x$ or $x^3$ as it is the most useful in fitting the data.
An Alternate View
Above, polynomial features were chosen based on how well they matched the target data. Another way to think about this is to note that we are still using linear regression once we have created new features. Given that, the best features will be linear relative to the target. This is best understood with an example.
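One way to see this is to plot each engineered feature against the target; the feature whose scatter falls on a straight line is the one linear regression can exploit directly. A small sketch of that plot (matplotlib assumed, with the quadratic target from above):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(0, 20, 1)
y = x**2                          # quadratic target assumed, as above
X = np.c_[x, x**2, x**3]          # candidate engineered features

# Plot each candidate feature against the target value y.
fig, ax = plt.subplots(1, 3, figsize=(12, 3), sharey=True)
for i, name in enumerate(["x", "x^2", "x^3"]):
    ax[i].scatter(X[:, i], y)
    ax[i].set_xlabel(name)
ax[0].set_ylabel("y")
plt.show()                        # only the x^2 panel forms a straight line
```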
Above, it is clear that the $x^2$ feature mapped against the target value $y$ is linear. Linear regression can then easily generate a model using that feature.
Scaling features
As described in the last lab, if the data set has features with significantly different scales, one should apply feature scaling to speed gradient descent. In the example above, there are $x$, $x^2$ and $x^3$, which will naturally have very different scales. Let's apply Z-score normalization to our example.
Now we can try again with a more aggressive value of alpha:
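A sketch of the normalization step and the retrained model (the zscore_normalize_features helper below is written inline for illustration, and the larger learning rate shown is an assumed value in the spirit of the lab):

```python
import numpy as np

def zscore_normalize_features(X):
    """Column-wise z-score normalization: (X - mean) / standard deviation."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

x = np.arange(0, 20, 1)
y = x**2
X = np.c_[x, x**2, x**3]
print(f"Peak-to-peak range by column, raw:        {np.ptp(X, axis=0)}")

X_norm, mu, sigma = zscore_normalize_features(X)
print(f"Peak-to-peak range by column, normalized: {np.ptp(X_norm, axis=0)}")

# With comparable feature scales, a far larger learning rate is stable,
# e.g. alpha = 1e-1 instead of 1e-7 (using the run_gd sketch from earlier):
# w, b = run_gd(X_norm, y, iterations=100_000, alpha=1e-1)
```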
Feature scaling allows this to converge much faster. Note again the values of $\mathbf{w}$. The $w_1$ term, which is the $x^2$ term, is the most emphasized. Gradient descent has all but eliminated the $x^3$ term.
Complex Functions
With feature engineering, even quite complex functions can be modeled:
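As a closing sketch under stated assumptions (the specific target function and polynomial degree here are illustrative choices, not necessarily the lab's exact ones): a cosine can be approximated by engineering many powers of $x$, normalizing them, and fitting with the same linear-regression machinery:

```python
import numpy as np

x = np.arange(0, 20, 1)
y = np.cos(x / 2)                 # an illustrative non-linear target

# Engineer powers of x up to degree 13, then z-score normalize each column
# so that gradient descent can use a large learning rate.
X = np.column_stack([x**p for p in range(1, 14)])
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)

# Fit with any linear-regression routine, e.g. the run_gd sketch from earlier
# (many iterations are needed for a close fit):
# w, b = run_gd(X_norm, y, iterations=1_000_000, alpha=1e-1)
```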
Congratulations!
In this lab you:
learned how linear regression can model complex, even highly non-linear functions using feature engineering
recognized that it is important to apply feature scaling when doing feature engineering