Exercise

Using the sigmoid and softmax functions

The sigmoid and softmax functions are two of the most popular activation functions in deep learning. Both are typically applied as the last step of a neural network: the sigmoid is used for binary classification problems, while the softmax is used for multi-class classification problems. This exercise will familiarize you with creating and using both functions.
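To illustrate the difference, here is a minimal sketch (the tensor values are made up for demonstration): the sigmoid maps a single score to a probability in (0, 1), while the softmax maps a vector of scores to a distribution whose entries sum to 1.

```python
import torch

# Sigmoid: one score -> one probability in (0, 1), suited to binary classification.
binary_score = torch.tensor([0.8])
print(torch.sigmoid(binary_score))  # a single value between 0 and 1

# Softmax: a vector of scores -> a probability distribution over classes.
multi_scores = torch.tensor([1.0, 2.0, 0.5])
probs = torch.softmax(multi_scores, dim=-1)
print(probs, probs.sum())  # three probabilities that sum to 1
```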

Let's say that you have a neural network that produced the values contained in the score tensor as its pre-activation output. You will apply activation functions to this output.

torch.nn is already imported as nn.

Instructions 1/2

  • 1

    Create a sigmoid function and apply it to the score tensor to generate a probability.

  • 2

    Create a softmax function and apply it to the score tensor to generate a probability.
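The two steps above can be sketched as follows. The score tensors here are invented stand-ins for the pre-activation output provided in the exercise:

```python
import torch
import torch.nn as nn

# Hypothetical pre-activation output; the exercise supplies `score` for you.
score = torch.tensor([[0.8]])

# Step 1: create a sigmoid function and apply it to the score tensor.
sigmoid = nn.Sigmoid()
probability = sigmoid(score)  # a tensor of values in (0, 1)

# Step 2: create a softmax function and apply it to a multi-class score tensor.
# dim=-1 normalizes over the last (class) dimension so each row sums to 1.
scores = torch.tensor([[1.0, -6.0, 2.5]])
softmax = nn.Softmax(dim=-1)
probabilities = softmax(scores)

print(probability)
print(probabilities)
```

Note that `nn.Softmax` requires the `dim` argument so it knows which dimension holds the class scores.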