1. Bayesian inference, again!
You’ve done some Bayesian inference! Again, I suppose, because you already tried it last chapter. But this time you did it from the ground up in exhaustive detail.
2. What have we done montage 1
You started by specifying a generative model from scratch in R, but then realized this was the same as the Binomial model.
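In code, that realization looked something like this (a sketch with made-up numbers: 100 people see the ad, each clicking with 10% probability):

    # The generative model written from scratch: each person who is shown
    # the ad clicks with probability prop_clicks.
    n_shown <- 100
    prop_clicks <- 0.1
    clicked <- runif(n_shown) < prop_clicks
    n_visitors <- sum(clicked)

    # Which turns out to be the same as a single draw from a Binomial:
    n_visitors <- rbinom(n = 1, size = n_shown, prob = prop_clicks)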
3. What have we done montage 2
Next, you specified a prior probability distribution over the underlying proportion of clicks.
4. What have we done montage 3
This prior represented the information you had going in: the proportion of clicks is likely somewhere between 0% and 20%, but you were uncertain about exactly where in that range it falls.
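Represented as a large number of samples, such a prior could look like this (a uniform prior over 0% to 20%; the sample size is arbitrary):

    n_samples <- 100000
    # Prior: the underlying proportion of clicks is somewhere between
    # 0% and 20%, with no value in that range favored over another.
    prop_clicks <- runif(n_samples, min = 0.0, max = 0.2)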
5. What have we done montage 4
Together, the generative model and this prior resulted in a joint probability distribution over both the underlying proportion of clicks and how many visitors you would get.
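As a sketch in code, sampling from the joint distribution just means running the generative model once for each draw from the prior (again with illustrative numbers):

    n_samples <- 100000
    prop_clicks <- runif(n_samples, min = 0.0, max = 0.2)
    # Each (prop_clicks, n_visitors) pair is one sample from the joint
    # distribution over parameter and data.
    n_visitors <- rbinom(n = n_samples, size = 100, prob = prop_clicks)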
6. What have we done montage 5
You then collected some data and used it to condition the joint distribution.
7. What have we done montage 6
In other words, you used Bayesian inference. This allowed the model to learn about the underlying proportion of clicks and resulted in an updated posterior probability distribution.
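With samples, conditioning is simply filtering. Continuing the sketch above, and supposing 13 of the 100 people clicked (an illustrative figure):

    # Keep only the draws from the joint distribution that reproduce the
    # observed data: 13 visitors out of 100.
    posterior <- prop_clicks[n_visitors == 13]
    hist(posterior)  # the updated, posterior probability distribution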
8. What have we done montage 7
And, finally, as a bonus, we used this posterior as the prior for the next ad campaign and predicted how many visitors we would get if we reran it. I hope you can see that if you collected even more data, you could keep repeating these steps to learn more and more about the underlying proportion of clicks.
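Continuing the sketch once more, prediction just means pushing each posterior draw back through the generative model:

    # Use the posterior samples as the prior for the next campaign and
    # simulate the number of visitors each draw would generate.
    n_visitors_next <- rbinom(n = length(posterior), size = 100, prob = posterior)
    hist(n_visitors_next)  # predicted visitors if we reran the ad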
This repeated updating is exactly how prop_model from the first chapter worked.
9. prop_model result
It used the same Bayesian model; the only differences were that the prior was uniform from 0% to 100% and that the model was updated with one success or failure at a time.
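A minimal sketch of that kind of one-at-a-time updating (this is just the idea, not the actual prop_model source):

    # Start from a uniform prior over the whole 0% to 100% range.
    prop <- runif(100000, min = 0, max = 1)
    data <- c(1, 0, 0, 1)  # successes and failures, one at a time
    for (x in data) {
      # Simulate a single success or failure for each remaining draw...
      sim <- rbinom(n = length(prop), size = 1, prob = prop)
      # ...and keep only the draws whose simulation matched the data point.
      prop <- prop[sim == x]
    }
    # prop now holds samples from the posterior given all the data so far.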
10. What have we done 1
Taking a step back, what have we done? We have specified prior information,
11. What have we done 2
a generative model,
12. What have we done 3
and, given some data, we calculated the
13. What have we done 4
14. What have we done 5
updated probability of different parameter values. In the examples so far, we've used a Binomial model with a single parameter, but the cool thing is that the general method of Bayesian inference works for any generative model, with any number of parameters.
15. What have we done 6
That is, you can have any number of parameters and unknown values to plug into any generative model you can implement; the data can be multivariate or even consist of completely different data sets, and the Bayesian machinery we used in the simple case works in just the same way.
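To make that concrete, here is a sketch of the same machinery on an invented model with two unknown parameters; with continuous data, we keep the draws that land close to the observed value rather than matching it exactly:

    n_samples <- 100000
    # Priors over the two unknown parameters of a Normal generative model.
    mu <- runif(n_samples, min = 0, max = 100)
    sigma <- runif(n_samples, min = 0, max = 20)
    # Generative model: simulate one measurement for each draw.
    y_sim <- rnorm(n_samples, mean = mu, sd = sigma)
    # Condition on an observed measurement of, say, 42.
    keep <- abs(y_sim - 42) < 1
    posterior_mu <- mu[keep]
    posterior_sigma <- sigma[keep]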
16. What bayes need 1
This is why Bayesian methods are so broadly used, in everything from hypothesis testing to machine learning: as long as you can come up with a generative model for a data-analysis problem, Bayesian inference can be used to fit that model and learn from data. Well, in theory at least, because in practice you also need a computational method that is efficient enough. The method we used in this chapter is straightforward and easy to understand, but it scales very badly when you have more data or more complicated models. So that's actually the fourth
17. What bayes need 2
thing you need to do Bayesian inference: a computational method. We will talk about more computationally efficient methods for doing Bayesian inference in chapter 4.
18. Next up: Why use Bayes?
In this chapter, we looked at how Bayesian inference works. Next up, in chapter 3, we'll look at some reasons why you would want to use it for Bayesian data analysis.