1. The prior model
Hi! Welcome to "Bayesian Modeling with RJAGS."
I'm Alicia Johnson, an associate professor of Statistics at Macalester College, and I'll be your instructor for this course.
I assume that you've worked through the previous course in Bayesian Data Analysis, and are thus familiar with the fundamental ideas behind Bayesian analysis and inference.
In this course, you'll generalize these logical, flexible, and intuitive fundamentals to more advanced Bayesian model settings.
2. Course goals
Specifically, you'll explore foundational Bayesian models, such as the Beta-Binomial, Normal-Normal, and Bayesian regression models, that are easily generalized to broader settings.
You will learn how to define, compile, and simulate these models using the RJAGS package in R.
Finally, you will learn how to use RJAGS simulation output to conduct Bayesian posterior inference. Let's start with a review.
3. Bayesian elections: The prior
Suppose you're running in an election for public office.
4. Bayesian elections: The prior
Older polls suggest that you have the support of 45% of the voters. However, due to polling errors and fluctuations in support, this figure is uncertain.
5. Bayesian elections: The prior
Engineered from past polling & election data, the prior probability model shown here captures this uncertainty: you'll most likely receive *around* 45% of the vote. It's also unlikely, though possible, that you'll receive as little as 30% or as much as 60% of the vote.
6. Bayesian elections: The data
To gain better insight, your campaign conducts a small poll of 10 voters. Among them, 6 (or 60%) plan to vote for you.
7. Bayesian elections: The posterior
The posterior model combines insights from the prior and these small polling data. Notably, in light of the poll, the updated or *posterior* model of your election support is slightly more optimistic than the prior model.
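How much more optimistic? For a Beta prior and Binomial poll data, conjugacy gives the posterior in closed form: observing x supporters among n polled voters turns a Beta(a, b) prior into a Beta(a + x, b + n − x) posterior. A minimal base-R sketch of this first update, using the Beta(45, 55) prior introduced later in this chapter (the conjugate shortcut is standard Beta-Binomial theory, not something the RJAGS simulation relies on):

```r
# Beta(45, 55) prior for election support p
a <- 45
b <- 55

# First poll: 6 of 10 voters support you
x <- 6
n <- 10

# Conjugate Beta-Binomial update: posterior is Beta(a + x, b + n - x)
a_post <- a + x       # 51
b_post <- b + n - x   # 59

# The posterior mean inches up from the prior mean of 0.45
prior_mean <- a / (a + b)                 # 0.45
post_mean  <- a_post / (a_post + b_post)  # ~0.464
```

With only 10 voters polled, the data nudge the model just slightly, consistent with the "slightly more optimistic" posterior on the slide.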
8. Bayesian elections: New data
You continue to collect data. In a new poll, 48 of 90 polled voters (or 53%) plan to vote for you.
9. Bayesian elections: New posterior
In light of these new data, the posterior optimism about your election chances inches up once again.
10. Bayesian elections: Newer data
In a final poll, 166 of 300 (or 55%) of polled voters support you.
11. Bayesian elections: Newer posterior
Compelled by the information in such a large sample, your posterior optimism about receiving more than 50% of votes (hence winning the election) is very high.
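The sequential updates above can be sketched with the same conjugate shortcut: each poll's posterior serves as the prior for the next, which is equivalent to pooling all the polls against the original Beta(45, 55) prior. A hedged base-R illustration (the exact posterior probability of winning is my computation, not a figure quoted in the course):

```r
# Original prior: Beta(45, 55)
a0 <- 45
b0 <- 55

# All three polls: successes x out of n voters
polls <- data.frame(x = c(6, 48, 166), n = c(10, 90, 300))

# Pooling the polls via Beta-Binomial conjugacy
a_post <- a0 + sum(polls$x)            # 265
b_post <- b0 + sum(polls$n - polls$x)  # 235

post_mean <- a_post / (a_post + b_post)  # ~0.53
p_win <- 1 - pbeta(0.5, a_post, b_post)  # posterior P(p > 0.5), roughly 0.9
```

With 400 voters' worth of data folded in, the posterior probability of majority support is indeed very high.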
12. Bayesian thinking
This election example highlights the power of Bayesian models. Not only does a Bayesian posterior model combine insights from the prior model & observed data, but it also continues to evolve as new data come in.
13. Building a prior model
In Chapter 1, you'll explore the three fundamental pieces of Bayesian models: the prior, likelihood, and posterior. Let's start with the prior.
Engineering and communicating a prior model requires some notation. Let p denote the proportion of voters who support you; thus p is a value between 0 and 1. In a Bayesian analysis, we treat parameter p as a random variable. Thus the prior *model* of p is simply a probability distribution.
The Beta distribution, which also lives on 0 to 1, is a natural choice here. The original prior model for p (shown here) corresponds to the Beta distribution with shape parameters 45 and 55.
We communicate this model using mathematical notation that specifies the name of the distribution (Beta) and the parameter values upon which it depends (45 and 55).
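One way to get a feel for the Beta(45, 55) prior before turning to RJAGS is to simulate from it in base R. A small sketch (the seed is arbitrary, chosen only for reproducibility; the course's exercises use their own simulation setup):

```r
set.seed(84735)  # arbitrary seed for reproducibility

# Simulate 10,000 draws of p from the Beta(45, 55) prior
prior_sim <- rbeta(n = 10000, shape1 = 45, shape2 = 55)

# The simulated draws approximate the prior's features:
mean(prior_sim)                       # close to 45 / (45 + 55) = 0.45
quantile(prior_sim, c(0.025, 0.975))  # most mass between ~0.35 and ~0.55
```

A histogram of `prior_sim` would closely match the prior density pictured on the slide.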
14. Tuning the prior
*Tuning* the Beta shape parameters produces alternative prior models of p, just a few of which are shown here. These range from models that reflect more pessimism about your election chances (here the Beta(1,5) in green) to models that reflect a complete lack of certainty about your chances (here the Beta(1,1) in red).
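The effect of tuning can be checked numerically: a Beta(a, b) prior has mean a / (a + b), so the shape parameters directly control where the prior places your chances. A quick sketch comparing the three priors mentioned above (the helper function name is my own):

```r
# Mean of a Beta(a, b) prior: E(p) = a / (a + b)
beta_mean <- function(a, b) a / (a + b)

beta_mean(1, 5)    # pessimistic prior:            ~0.167
beta_mean(1, 1)    # Uniform(0, 1), no certainty:   0.5
beta_mean(45, 55)  # the original election prior:   0.45
```

Note that Beta(1, 1) is exactly the Uniform distribution on 0 to 1, which is why it reflects a complete lack of certainty.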
15. Let's practice!
In the following exercises you'll use simulation techniques to approximate, explore, & interpret the Beta prior model.