Fitting a parallel slopes linear regression
In Introduction to Regression with statsmodels in Python, you learned to fit linear regression models with a single explanatory variable. In many cases, using only one explanatory variable limits the accuracy of predictions. To truly master linear regression, you need to be able to fit regression models with multiple explanatory variables.
The case when there is one numeric explanatory variable and one categorical explanatory variable is sometimes called a "parallel slopes" linear regression due to the shape of the predictions — more on that in the next exercise.
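As a preview of that shape, here is a minimal sketch of a parallel slopes fit in statsmodels, using a small synthetic DataFrame (not the course dataset): adding the categorical variable to the formula, with `+ 0` to drop the global intercept, gives each category its own intercept while the numeric variable keeps a single shared slope.

```python
import pandas as pd
from statsmodels.formula.api import ols

# Synthetic stand-in data with one numeric and one categorical explanatory
# variable (the real exercise uses the taiwan_real_estate dataset instead)
df = pd.DataFrame({
    "price_twd_msq": [5.0, 6.0, 7.1, 8.9, 10.0, 11.2,
                      3.0, 4.1, 5.0, 7.2, 8.1, 9.0],
    "n_convenience": [0, 1, 2, 4, 5, 6,
                      0, 1, 2, 4, 5, 6],
    "house_age_years": ["0 to 15"] * 6 + ["30 to 45"] * 6,
})

# "+ 0" removes the global intercept, so each house_age_years group gets
# its own intercept; n_convenience contributes one shared slope -- hence
# the predictions form parallel lines
mdl = ols("price_twd_msq ~ n_convenience + house_age_years + 0",
          data=df).fit()
print(mdl.params)
```

The printed coefficients contain one intercept per age group plus a single slope for n_convenience.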
Here, you'll revisit the Taiwan real estate dataset. Recall the meaning of each variable.
| Variable | Meaning |
|---|---|
| dist_to_mrt_station_m | Distance to the nearest MRT metro station, in meters. |
| n_convenience | Number of convenience stores within walking distance. |
| house_age_years | The age of the house, in years, split into 3 groups. |
| price_twd_msq | House price per unit area, in New Taiwan dollars per square meter. |
taiwan_real_estate is available.
This exercise is part of the course *Intermediate Regression with statsmodels in Python*.
Hands-on interactive exercise: have a go at this exercise by completing the sample code below.
# Import ols from statsmodels.formula.api
____
# Fit a linear regression of price_twd_msq vs. n_convenience
mdl_price_vs_conv = ____
# Print the coefficients
print(____)
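One possible completion of the blanks above, sketched with a small synthetic DataFrame standing in for the taiwan_real_estate data that the exercise environment provides:

```python
import pandas as pd

# Import ols from statsmodels.formula.api
from statsmodels.formula.api import ols

# Synthetic stand-in for the exercise's taiwan_real_estate DataFrame
taiwan_real_estate = pd.DataFrame({
    "n_convenience": [0, 1, 2, 3, 4, 5, 6, 7],
    "price_twd_msq": [5.0, 6.1, 7.2, 8.0, 9.1, 10.2, 11.0, 12.1],
})

# Fit a linear regression of price_twd_msq vs. n_convenience
mdl_price_vs_conv = ols("price_twd_msq ~ n_convenience",
                        data=taiwan_real_estate).fit()

# Print the coefficients (an Intercept plus a slope on n_convenience)
print(mdl_price_vs_conv.params)
```

Note that statsmodels formulas require calling `.fit()` on the model returned by `ols()` before coefficients are available via `.params`.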