The parts needed for Bayesian inference
In the last chapter, we ran a small but complete Bayesian data analysis from start to finish. And in this chapter, we are going to do it all over again. But this time we'll go slowly, and the goal is for you to understand how Bayesian inference actually works. First, let's look at all the parts that are needed to do Bayesian inference. Bayesian inference is a method for learning about unknown quantities from data, for example model parameters or what future data could look like. It requires three things to work:
Data,
something called a generative model, and
priors, what the model knows before seeing the data.

So, I think you know what data is, but what is a generative model? Well, it's a very general concept: it's any kind of computer program, mathematical expression, or set of rules that you can feed fixed parameter values and that you can use to generate simulated data. As an example, let's whip up
a generative model for the zombie drug experiment, and let's define it as a small program in R. There are many ways we could do this, but let's go with something simple. First, as before, let's assume that there is an underlying proportion of zombies that will be cured by the drug and that the drug is given to a number of zombies. These are the parameters of the model, and we could set them to anything, but for now, let's just set prop_success to 0.15 and n_zombies to 13.

Now we're going to simulate data, so we'll start by creating an empty data vector, and for each zombie we're going to simulate a 1 if cured and a 0 if not cured. But how do we do this? The simple assumption I'm going to make is that whether a zombie gets cured depends only on the underlying proportion of success, and nothing else. Here that means we want to simulate a 1, a success, 15% of the time. We can do that by sampling a single number from a uniform probability distribution between 0 and 1, and if that number is less than 0.15 we'll call it a success.
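One way to sketch this single step in R (the variable names follow the description above; the course's own code may differ slightly):

```r
# Parameters of the generative model, as set above
prop_success <- 0.15
n_zombies <- 13

# One simulated zombie: draw a uniform number between 0 and 1.
# It falls below 0.15 about 15% of the time, which we count as "cured".
cured <- runif(1, min = 0, max = 1) < prop_success
cured  # TRUE or FALSE
```

Note that the comparison yields a logical value, not a number, which is why a conversion step is needed later.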
This gives us almost what we want, except that the simulated data is a vector of TRUEs and FALSEs, so the last step is to turn this into numbers instead. Alright, we have a generative model for curing zombies; every time we run it, it generates some simulated data for us.
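Putting the steps together, the whole generative model might look like this in R (a sketch following the walkthrough above, not necessarily the course's exact code):

```r
# Parameters: the underlying cure proportion and the number of zombies
prop_success <- 0.15
n_zombies <- 13

# Simulate the data: one uniform draw per zombie,
# counted as a success if it falls below prop_success
data <- c()
for (zombie in 1:n_zombies) {
  data[zombie] <- runif(1, min = 0, max = 1) < prop_success
}

# Turn the TRUE/FALSE vector into 1s and 0s
data <- as.numeric(data)
data
```

Each run produces a fresh vector of 13 zeros and ones, one value per zombie.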
This is actually the generative model behind the Bayesian prop_model function we used to analyze the zombie data in the last chapter, but at this point, it's still not clear how this generative model relates to Bayesian inference. We'll come around to that eventually, but for now, take this generative model for a spin in a couple of exercises.