Predictive Power
In our experiment on how much people like you when you give them different amounts of money, we found the regression equation: \(liking = 1.501 + 0.778 * money\). We can assess how well the predictor (money) describes the response variable (liking) by looking at the R squared, which tells us how much of the variance in the response variable (liking) is explained by the predictor variable (money).
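To see what this equation says, you can plug a value for money straight in; the lines below are just a quick illustration using the coefficients quoted above.

# Coefficients from the regression equation above
intercept <- 1.501
slope <- 0.778
# Predicted liking for a participant given 5 units of money
intercept + slope * 5  # 1.501 + 0.778 * 5 = 5.391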
One way of finding the R squared is through squaring the correlation between the predictor and response variable. To find this you can use the function `cor()`, which takes your two variables, separated by a comma, as arguments, e.g. `cor(variable1, variable2)`. Remember that you can square a value in R using `^2`. For example, `3^2` would return `9`.
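Putting those two pieces together, a sketch of this first method is shown below, using the same money and liking vectors that appear in the sample code further down.

# The experiment's data (same vectors as in the sample code below)
money <- c(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
liking <- c(2.2, 2.8, 4.5, 3.1, 8.7, 5.0, 4.5, 8.8, 9.0, 9.2)
# R squared as the squared correlation between predictor and response
cor(money, liking)^2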
However, another way of finding the R squared is using `lm()` and `summary()`. If we ask R to give us the summary of `lm()`, we get quite a lot of extra information in addition to our regression coefficients, including the R squared! To use `summary()`, simply place the object you want information about between the brackets, e.g. `summary(lm(variable1 ~ variable2))`.
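Continuing with the same money and liking vectors, a sketch of this second method is shown below; in the printed summary the R squared appears as "Multiple R-squared", and it can also be pulled out of the summary object directly.

# Summary of the regression of liking on money
model_summary <- summary(lm(liking ~ money))
# The printed output includes "Multiple R-squared"
model_summary
# The R squared can also be extracted as a single number
model_summary$r.squared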
Exercise instructions
- In your script, calculate the R squared using the function `cor()`.
- In your script, add the `summary()` function to the `lm()` function you used earlier. Assign this to the object `sum`.
- In your script, print `sum`.
- Hit 'Submit' and compare the R squared values obtained by both of these methods!
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Vector containing the amount of money you gave participants
money <- c(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
# Vector containing the amount the participants liked you
liking <- c(2.2, 2.8, 4.5, 3.1, 8.7, 5.0, 4.5, 8.8, 9.0, 9.2)
# Calculate the R squared of our regression model using cor()
# Assign the summary of lm(liking ~ money) to 'sum'
sum <- lm(liking ~ money)
# Print sum
sum
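Once the blanks are filled in, the two approaches can be compared directly; since they estimate the same quantity, the values should match. One possible completion (a sketch, not the only way to write it):

# Method 1: squared correlation between predictor and response
cor(money, liking)^2
# Method 2: summary of the linear model, assigned to 'sum'
sum <- summary(lm(liking ~ money))
sum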