1. Congratulations!
Well done! You've learned about multiple linear regression and logistic regression!
2. You learned things
In Chapter 1 you fitted, visualized, made predictions with, and assessed parallel slopes linear regression models.
In Chapter 2 you explored interactions between explanatory variables, and tried to resolve Simpson's Paradox.
In Chapter 3 you saw that although visualization gets tricky with more explanatory variables, modeling easily handles them.
In Chapter 4 you ran logistic regression with multiple explanatory variables, and explored the logistic distribution.
On top of that, you implemented the algorithms for linear regression and logistic regression yourself. That's genuinely advanced work, so well done!
3. There is more to learn
There are a few more things on the path to mastering regression that we didn't have time for in this course.
Firstly, it is common practice to split your data into separate training and testing datasets. You calculate the model coefficients on the training set, then assess the model's performance on the testing set. This helps you catch overfitting, where a model works well only on the specific dataset it was fitted to.
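As a minimal sketch, here is what a training-testing split could look like in Python with scikit-learn. The data frame and its column names are hypothetical stand-ins, not data from this course.

# A minimal sketch of a training-testing split, using hypothetical data.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

df = pd.DataFrame({
    "x1": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0],
    "x2": [2.0, 1.0, 4.0, 3.0, 6.0, 5.0, 8.0, 7.0],
    "y":  [3.1, 3.0, 7.2, 6.8, 11.1, 10.9, 15.0, 14.8],
})

# Hold out a quarter of the rows for testing; fit on the rest.
X_train, X_test, y_train, y_test = train_test_split(
    df[["x1", "x2"]], df["y"], test_size=0.25, random_state=42
)

# Calculate the model coefficients on the training set only.
model = LinearRegression().fit(X_train, y_train)

# Assess performance (R-squared) on the unseen testing set.
print("Testing set R-squared:", model.score(X_test, y_test))

A large gap between training and testing performance is the classic sign of overfitting.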
Secondly, an even stronger safeguard against overfitting is cross-validation. This involves fitting and assessing the model several times, on several random training-testing splits, to give a set of models and performance scores from which to choose.
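Continuing the same hypothetical setup, cross-validation might look like this sketch, using scikit-learn's KFold and cross_val_score.

# A minimal sketch of k-fold cross-validation, using hypothetical data.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

df = pd.DataFrame({
    "x1": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0],
    "x2": [2.0, 1.0, 4.0, 3.0, 6.0, 5.0, 8.0, 7.0],
    "y":  [3.1, 3.0, 7.2, 6.8, 11.1, 10.9, 15.0, 14.8],
})

# Four random training-testing splits; the model is fitted and
# assessed once per fold.
cv = KFold(n_splits=4, shuffle=True, random_state=42)
scores = cross_val_score(LinearRegression(), df[["x1", "x2"]], df["y"], cv=cv)

print("R-squared per fold:", scores)
print("Mean R-squared:", scores.mean())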
Thirdly, to help you determine which explanatory variables should be included in the model, linear and logistic regression report the significance of each coefficient: a measure of whether the coefficient is really different from zero, or merely appears different from zero by chance.
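One way to see these significances in Python is via a statsmodels model fit; again, the data here is a hypothetical stand-in. Each coefficient gets a p-value, where a small value suggests the coefficient really does differ from zero.

# A minimal sketch of checking coefficient significance, using
# statsmodels and hypothetical data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "x1": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0],
    "x2": [2.0, 1.0, 4.0, 3.0, 6.0, 5.0, 8.0, 7.0],
    "y":  [3.1, 3.0, 7.2, 6.8, 11.1, 10.9, 15.0, 14.8],
})

model = smf.ols("y ~ x1 + x2", data=df).fit()

# The p-value for each coefficient tests whether it differs from
# zero; a large p-value suggests the variable adds little.
print(model.pvalues)

The same idea applies to logistic regression (smf.logit), and model.summary() shows the full table of coefficients, standard errors, and p-values.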
4. Advanced regression
DataCamp has many more courses on advanced regression techniques. You're now ready to take them!
5. Have fun regressing!
Congratulations again!