Path: blob/master/2020-spring/materials/worksheet_09/worksheet_09.ipynb
Worksheet 9 - Regression Continued
Lecture and Tutorial Learning Goals:
By the end of the week, students will be able to:
Perform ordinary least squares regression in R using caret's train with method = "lm" to predict the values for a test dataset.
Compare and contrast predictions obtained from k-nearest neighbour regression to those obtained using simple ordinary least squares regression on the same dataset.
In R, overlay the ordinary least squares regression line from geom_smooth on a single plot.
Warm-up Questions
Here are some warm-up questions on the topic of multivariate regression to get you thinking before we jump into data analysis. The course readings should help you answer these.
Question 1.0 Multiple Choice:
{points: 1}
In multivariate k-nn regression with one outcome/target variable and two predictor variables, the predictions take the form of what shape?
A. a flat plane
B. a wiggly/flexible plane
C. a straight line
D. a wiggly/flexible line
E. a 4D hyperplane
F. a 4D wiggly/flexible hyperplane
Save the letter of the answer you think is correct to a variable named answer1.0. Make sure you put quotation marks around the letter and pay attention to case.
Question 1.1 Multiple Choice:
{points: 1}
In simple linear regression with one outcome/target variable and one predictor variable, the predictions take the form of what shape?
A. a flat plane
B. a wiggly/flexible plane
C. a straight line
D. a wiggly/flexible line
E. a 4D hyperplane
F. a 4D wiggly/flexible hyperplane
Save the letter of the answer you think is correct to a variable named answer1.1. Make sure you put quotation marks around the letter and pay attention to case.
Question 1.2 Multiple Choice:
{points: 1}
In multivariate linear regression with one outcome/target variable and two predictor variables, the predictions take the form of what shape?
A. a flat plane
B. a wiggly/flexible plane
C. a straight line
D. a wiggly/flexible line
E. a 4D hyperplane
F. a 4D wiggly/flexible hyperplane
Save the letter of the answer you think is correct to a variable named answer1.2. Make sure you put quotation marks around the letter and pay attention to case.
Understanding Simple Linear Regression
Consider this small and simple data set:
Now consider these three potential lines of best fit for the same data set:
Question 2.0
{points: 1}
Use the graph below titled "Line A" to roughly calculate the average squared vertical distance between the points and the blue line. Read values off the graph to a precision of 0.25 (e.g. 1, 1.25, 1.5, 1.75, 2). Save your answer to a variable named answer2.0.
We reprint the plot for you in a larger size to make it easier to estimate the locations on the graph.
Question 2.1
{points: 1}
Use the graph titled "Line B" to roughly calculate the average squared vertical distance between the points and the purple line. Read values off the graph to a precision of 0.25 (e.g. 1, 1.25, 1.5, 1.75, 2). Save your answer to a variable named answer2.1.
We reprint the plot for you in a larger size to make it easier to estimate the locations on the graph.
Question 2.2
{points: 1}
Use the graph titled "Line C" to roughly calculate the average squared vertical distance between the points and the green line. Read values off the graph to a precision of 0.25 (e.g. 1, 1.25, 1.5, 1.75, 2). Save your answer to a variable named answer2.2.
We reprint the plot for you in a larger size to make it easier to estimate the locations on the graph.
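As a sketch of the computation these questions ask for, here is how the average squared vertical distance could be computed in R. The point coordinates and line below are made up for illustration; read your own estimates off the plots above.

```r
# Hypothetical (x, y) values read off a graph, and a candidate line
# y = b0 + b1 * x. These numbers are illustrative only.
x <- c(1, 2, 3, 4)
y <- c(1.25, 2, 2.75, 4.25)
b0 <- 0.5
b1 <- 0.75

predicted <- b0 + b1 * x
avg_sq_dist <- mean((y - predicted)^2)  # average squared vertical distance
avg_sq_dist
```

Ordinary least squares chooses the line that makes this quantity as small as possible.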
Question 2.3
{points: 1}
Based on your calculations above, which line would linear regression by ordinary least squares choose for our small and simple data set: Line A, B or C? Assign the letter that corresponds to the line to a variable named answer2.3. Make sure you put quotation marks around the letter and pay attention to case.
Marathon Training Revisited with Linear Regression!
Source: https://media.giphy.com/media/BDagLpxFIm3SM/giphy.gif
Remember our question from last week: what predicts which athletes will perform better than others? Specifically, we are interested in marathon runners, and in how the maximum distance run per week during training predicts the time it takes a runner to finish the race.
This time around, however, we will analyze the data using simple linear regression. At the end, we will compare our results to what we found last week with k-nn regression.
Question 3.0
{points: 1}
Load the data and assign it to an object called marathon.
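A minimal sketch of loading the data with the tidyverse. The file name "marathon.csv" is an assumption; substitute the actual path provided in your course materials.

```r
library(tidyverse)

# Assumed file name -- replace with the path given in your course repository.
marathon <- read_csv("marathon.csv")
head(marathon)
```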
Question 3.1
{points: 1}
Create a training and a testing dataset using 75% of the data as training data. Use set.seed(2000) and the max column as the input to createDataPartition (as we did in the last worksheet) so that we end up with the same training data set for simple linear regression that we had for k-nn regression (so we can compare our results between these two weeks).
At the end of this question you should have 4 objects named X_train, Y_train, X_test and Y_test.
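One way this split might look, following the pattern from the previous worksheet (the exact pipeline your course uses may differ slightly):

```r
library(caret)
library(tidyverse)

set.seed(2000)  # same seed as the k-nn worksheet, so the split matches

# Partition on the max column, holding out 25% of rows for testing.
training_rows <- marathon %>%
  select(max) %>%
  unlist() %>%
  createDataPartition(p = 0.75, list = FALSE)

X_train <- marathon %>% select(max) %>% slice(training_rows) %>% data.frame()
Y_train <- marathon %>% select(time_hrs) %>% slice(training_rows) %>% unlist()
X_test  <- marathon %>% select(max) %>% slice(-training_rows) %>% data.frame()
Y_test  <- marathon %>% select(time_hrs) %>% slice(-training_rows) %>% unlist()
```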
Question 3.2
{points: 1}
Using only the training observations in the data set, create a scatterplot to assess the relationship between race time (time_hrs) and maximum distance run per week during training (max). Put time_hrs on the y-axis and max on the x-axis. Assign this plot to an object called marathon_eda. Remember to do whatever is necessary to make this an effective visualization.
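A minimal sketch with ggplot2; the axis labels (and the units in them) are assumptions you should adjust to match the data documentation.

```r
library(tidyverse)

# Combine the training predictors and targets into one data frame for plotting.
marathon_eda <- data.frame(X_train, time_hrs = Y_train) %>%
  ggplot(aes(x = max, y = time_hrs)) +
  geom_point(alpha = 0.4) +
  xlab("Maximum distance run per week during training") +
  ylab("Race time (hours)")
marathon_eda
```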
Question 3.3
{points: 1}
Now use caret's train function with method = "lm" to fit your simple linear regression model. Name your simple linear regression model object lm_model.
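With the training objects from Question 3.1 in place, the fit itself is a one-liner:

```r
library(caret)

# Ordinary least squares via caret's unified train() interface.
lm_model <- train(x = X_train, y = Y_train, method = "lm")
lm_model
```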
Question 3.4
{points: 1}
Now, let's visualize the model predictions as a straight line overlaid on the training data. Use geom_smooth with method = "lm" and se = FALSE to visualize the predictions as a straight line. Name your plot lm_predictions.
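Building on the marathon_eda plot from Question 3.2, this might look like:

```r
# geom_smooth(method = "lm") fits its own least-squares line to the plotted
# data, which coincides with lm_model's line since both use the training set.
lm_predictions <- marathon_eda +
  geom_smooth(method = "lm", se = FALSE)
lm_predictions
```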
Question 3.5
{points: 1}
Calculate the RMSE to assess the goodness of fit of your lm_model (remember, this measures how well it predicts on the training data used to fit the model). Return a single numerical value named lm_rmse.
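A sketch of the computation, assuming the training objects defined earlier:

```r
# RMSE: predict on the training data and compare to the observed targets.
train_pred <- predict(lm_model, X_train)
lm_rmse <- sqrt(mean((train_pred - Y_train)^2))
lm_rmse
```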
Question 3.6
{points: 1}
Calculate the RMSPE using the test data. Return a single numerical value named lm_rmspe.
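The same computation as for the RMSE, but on the held-out test data:

```r
# RMSPE: predict on the test data and compare to the held-out targets.
test_pred <- predict(lm_model, X_test)
lm_rmspe <- sqrt(mean((test_pred - Y_test)^2))
lm_rmspe
```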
Question 3.61
{points: 1}
Now, let's visualize the model predictions as a straight line overlaid on the test data. Use geom_smooth with method = "lm" and se = FALSE to visualize the predictions as a straight line. Name your plot lm_predictions_test. Remember to do whatever is necessary to make this an effective visualization.
Question 3.7
{points: 1}
Compare the test RMSPE of k-nn regression (from the last worksheet) to that of simple linear regression. Which is greater?
A. Simple linear regression has a greater RMSPE
B. k-nn regression has a greater RMSPE
C. Neither, they are identical
Save the letter of the answer you think is correct to a variable named answer3.7. Make sure you put quotation marks around the letter and pay attention to case.
Question 3.8 Multiple Choice:
{points: 1}
Which model does a better job of predicting on the test data set?
A. Simple linear regression
B. k-nn regression
C. Neither, they are identical
Save the letter of the answer you think is correct to a variable named answer3.8. Make sure you put quotation marks around the letter and pay attention to case.
Question 3.9
(optional - not graded)
Given that the linear regression model is a straight line, we can write our model as a mathematical equation. We can get the two numbers we need (the y-intercept and the slope) from the finalModel attribute of our model object, as shown below:
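A sketch of what that cell might contain, assuming the lm_model object from Question 3.3:

```r
# Extract the fitted intercept and slope from the underlying lm fit.
lm_model$finalModel$coefficients
# The output is named (Intercept, max); the model equation then has the form
#   time_hrs = intercept + slope * max
```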
Use the numbers output in the cell above to write the model as a mathematical equation.
DOUBLE CLICK TO EDIT THIS CELL AND REPLACE THIS TEXT WITH YOUR ANSWER.