
Calculating information gain of color

Now that you know the entropies of the root and child nodes, you can calculate the information gain that color provides.

In the prior exercises, you calculated entropy_root, entropy_left, and entropy_right. They are available in the console.

Remember that you will take the weighted average of the child node entropies. So, you will need to calculate what proportion of the original observations ended up on the left and right side of the split. Store those in p_left and p_right, respectively.
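The weighted-average step described above can be written as a formula. Here, n_left and n_right denote the number of observations in each child node and n the total number of observations at the root:

```latex
\text{InfoGain} = H_{\text{root}} - \left( p_{\text{left}} \, H_{\text{left}} + p_{\text{right}} \, H_{\text{right}} \right),
\qquad
p_{\text{left}} = \frac{n_{\text{left}}}{n}, \quad
p_{\text{right}} = \frac{n_{\text{right}}}{n}
```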

[Figure: decision tree split by color]

This exercise is part of the course

Dimensionality Reduction in R


Exercise instructions

  • Calculate the split weights — that is, the proportion of observations on each side of the split.
  • Calculate the information gain using the weights and the entropies.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Calculate the split weights
p_left <- ___/12
p_right <- ___/___

# Calculate the information gain
info_gain <- ___ - 
  (___ * entropy_left +
  p_right * ___)

info_gain
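If you get stuck, the following sketch shows the shape of the full calculation using made-up numbers. The entropy values and the 7/5 split below are hypothetical placeholders, not the answer to this exercise; substitute the entropy_root, entropy_left, and entropy_right you computed earlier and the actual split counts.

```r
# Hypothetical values for illustration only -- in the exercise,
# entropy_root, entropy_left, and entropy_right are already defined.
entropy_root  <- 1.0
entropy_left  <- 0.985   # assumed entropy of a 7-observation left child
entropy_right <- 0.722   # assumed entropy of a 5-observation right child

# Split weights: the proportion of the 12 observations in each child node
p_left  <- 7 / 12
p_right <- 5 / 12

# Information gain: parent entropy minus the weighted child entropy
info_gain <- entropy_root - (p_left * entropy_left + p_right * entropy_right)
info_gain  # about 0.1246 with these made-up values
```

A sanity check: the weights must sum to 1, and the information gain can never exceed the entropy of the root node.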