Fitting a parallel slopes linear regression
In Introduction to Regression in R, you learned to fit linear regression models with a single explanatory variable. In many cases, using only one explanatory variable limits the accuracy of predictions. To truly master linear regression, you therefore need to be able to include multiple explanatory variables.
The case when there is one numeric explanatory variable and one categorical explanatory variable is sometimes called a "parallel slopes" linear regression due to the shape of the predictions—more on that in the next exercise.
Here, you'll revisit the Taiwan real estate dataset. Recall the meaning of each variable.
Variable | Meaning |
---|---|
dist_to_mrt_station_m | Distance to nearest MRT metro station, in meters. |
n_convenience | No. of convenience stores within walking distance. |
house_age_years | The age of the house, in years, in 3 groups. |
price_twd_msq | House price per unit area, in New Taiwan dollars per meter squared. |
taiwan_real_estate is available.
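As a preview of the syntax, here is a minimal sketch of a parallel slopes fit using these variables, assuming taiwan_real_estate is already loaded in your session. The model name mdl_price_vs_both is just a placeholder, not part of the exercise.

```r
# Parallel slopes model: one numeric explanatory variable (n_convenience)
# plus one categorical explanatory variable (house_age_years).
mdl_price_vs_both <- lm(
  price_twd_msq ~ n_convenience + house_age_years,
  data = taiwan_real_estate
)

# The coefficients show a single slope for n_convenience and an
# intercept adjustment for each house age group.
mdl_price_vs_both
```

Because house_age_years is categorical, each age group gets its own intercept while sharing one slope for n_convenience, which is what makes the prediction lines parallel.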
This exercise is part of the course Intermediate Regression in R.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Fit a linear regr'n of price_twd_msq vs. n_convenience
mdl_price_vs_conv <- ___
# See the result
mdl_price_vs_conv
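If you want to check your work, one possible completion of the blank, again assuming taiwan_real_estate is loaded, is:

```r
# Fit a linear regression of price_twd_msq vs. n_convenience
mdl_price_vs_conv <- lm(price_twd_msq ~ n_convenience, data = taiwan_real_estate)

# See the result
mdl_price_vs_conv
```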