Adding two random variables

1. Adding two random variables together

In the last lesson you learned how to multiply one random variable by a constant. Now you'll learn about adding multiple random variables together.

2. Adding two random variables

Suppose we defined two random variables, X and Y. X is the result of flipping ten coins that each have a point-5 probability of heads, and Y is the result of flipping one hundred coins that each have a point-2 probability of heads. Assume these are independent random variables: you flipped the coins separately. Now suppose we add those two random variables together to get a random variable Z. If we get six heads in X and 22 in Y, Z would be equal to 28. Here's a histogram of the distributions of X, Y, and Z. Notice that Z is both larger and more spread out than either X or Y. While its distribution looks somewhat similar to those of X and Y, Z doesn't actually follow a binomial distribution, but we can still make some predictions about its properties.
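As a concrete illustration of how one draw of Z is built from X and Y, here is a minimal sketch in Python (the lesson's own code may use different tooling; numpy's binomial sampler stands in for flipping the coins, and the seed is an arbitrary choice for reproducibility):

    import numpy as np

    rng = np.random.default_rng(42)

    # One draw of X: 10 coins, each with probability 0.5 of heads
    x = rng.binomial(n=10, p=0.5)

    # One draw of Y: 100 coins, each with probability 0.2 of heads
    y = rng.binomial(n=100, p=0.2)

    # Z is simply the sum of the two results
    z = x + y
    print(x, y, z)   # e.g. 6 heads in X and 22 in Y gives z = 28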

3. Simulation: expected value of X + Y

We can simulate Z to find out its properties. First, simulate one hundred thousand draws from X, each consisting of 10 flips of a fair coin. As we've seen before, the expected value of X is about 5. We also simulate one hundred thousand draws from Y, and see that its expected value is about 20, which we could have predicted with 100 times point-2. Now we create a variable Z, which is X plus Y. When we take the mean, we see that the expected value of Z is about 25. Notice that that's 5 plus 20: the sum of the two variables' means. This is a general rule: the expected value of X plus Y is the expected value of X plus the expected value of Y.
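A hedged sketch of that simulation, again assuming Python with numpy (variable names are illustrative):

    import numpy as np

    rng = np.random.default_rng()

    # 100,000 draws each from X ~ Binomial(10, 0.5) and Y ~ Binomial(100, 0.2)
    X = rng.binomial(n=10, p=0.5, size=100_000)
    Y = rng.binomial(n=100, p=0.2, size=100_000)

    # Add the two random variables draw by draw
    Z = X + Y

    print(X.mean())   # ~ 5
    print(Y.mean())   # ~ 20
    print(Z.mean())   # ~ 25, which is 5 + 20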

4. Simulation: variance of X + Y

What about the variance of Z? We saw in the histogram that it's more spread out than either X or Y. Indeed, the variance of our simulated X is about 2-point-5, the variance of the simulated Y is about 16, and the variance of Z is about 18-point-5. Notice that that's the variance of X plus the variance of Y. So this follows a rule much like the one for the expected value: the variance of the sum of two independent random variables is the sum of their variances.
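Continuing the same kind of sketch (the setup is repeated so the snippet runs on its own):

    import numpy as np

    rng = np.random.default_rng()
    X = rng.binomial(n=10, p=0.5, size=100_000)
    Y = rng.binomial(n=100, p=0.2, size=100_000)
    Z = X + Y

    print(X.var())   # ~ 2.5  (10 * 0.5 * 0.5)
    print(Y.var())   # ~ 16   (100 * 0.2 * 0.8)
    print(Z.var())   # ~ 18.5, the sum of the two variances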

5. Rules for combining random variables

This gives us our two general rules for the properties of the sum of random variables. One note is that the rule for the expected value holds even if X and Y aren't independent, that is, even if the outcome of one influences the probabilities of the other. However, the rule for adding variances holds only if X and Y are independent. This distinction matters in probability, but note that your exercises will only involve summing independent variables.
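Written out, the two rules, and how they apply to this lesson's example, look like this:

    E[X + Y]   = E[X] + E[Y]              (always)
    Var(X + Y) = Var(X) + Var(Y)          (only when X and Y are independent)

    E[Z]   = 5 + 20   = 25
    Var(Z) = 2.5 + 16 = 18.5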

6. Let's practice!