Get Started

Calculating slopes

You're now going to practice calculating slopes. For the mean-squared-error loss of a single data point, the slope of the loss with respect to a weight is 2 * x * (xb - y), or equivalently 2 * input_data * error. Note that x and b each contain multiple numbers (x is the vector of input values for a data point, and b is the vector of weights). In this case, the output will also be a vector, which is exactly what you want.

You're ready to write the code to calculate this slope for a single data point. You'll use pre-defined weights called weights as well as data for a single point called input_data. The actual value of the target you want to predict is stored in target.
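As a quick sanity check on the formula, the sketch below computes the slope for some hypothetical values (the exercise pre-loads its own arrays; these numbers are placeholders) and compares one component against a finite-difference estimate of the gradient:

```python
import numpy as np

# Hypothetical values for illustration; the exercise provides its own.
weights = np.array([0.0, 2.0, 1.0])
input_data = np.array([1.0, 2.0, 3.0])
target = 0.0

preds = (weights * input_data).sum()   # xb: dot product of inputs and weights
error = preds - target                 # xb - y
slope = 2 * input_data * error         # 2 * x * (xb - y)

# Finite-difference check on the first weight: nudge it by eps and
# see how much the squared-error loss changes.
loss = lambda w: ((w * input_data).sum() - target) ** 2
eps = 1e-6
w_plus = weights.copy()
w_plus[0] += eps
numeric = (loss(w_plus) - loss(weights)) / eps

print(slope)    # analytic gradient, one entry per weight
print(numeric)  # should be close to slope[0]
```

The numeric estimate agrees with the first entry of the analytic slope, which is a handy way to check any gradient formula.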


Exercise instructions

  • Calculate the predictions, preds, by multiplying weights by the input_data and computing their sum.
  • Calculate the error, which is preds minus target. Notice that this error corresponds to xb-y in the gradient expression.
  • Calculate the slope of the loss function with respect to the weights. To do this, take the product of input_data and error and multiply that by 2.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Calculate the predictions: preds
preds = ____

# Calculate the error: error
error = ____ - ____

# Calculate the slope: slope
slope = ____ * ____ * ____

# Print the slope
print(slope)
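One way to fill in the blanks is shown below. The variable names weights, input_data, and target match those pre-loaded in the exercise; the sample values here are placeholders so the snippet runs on its own:

```python
import numpy as np

# Placeholder values; in the exercise these are pre-loaded.
weights = np.array([0.0, 2.0, 1.0])
input_data = np.array([1.0, 2.0, 3.0])
target = 0.0

# Calculate the predictions: preds
preds = (weights * input_data).sum()

# Calculate the error: error
error = preds - target

# Calculate the slope: slope
slope = 2 * input_data * error

# Print the slope
print(slope)
```

Because input_data is a NumPy array, the final multiplication happens elementwise, so slope holds one gradient entry per weight.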

This exercise is part of the course

Introduction to Deep Learning in Python


Learn the fundamentals of neural networks and how to build deep learning models using Keras 2.0 in Python.

Learn how to optimize the predictions generated by your neural networks. You'll use a method called backward propagation, which is one of the most important techniques in deep learning. Understanding how it works will give you a strong foundation to build on in the second half of the course.

Exercise 1: The need for optimization
Exercise 2: Calculating model errors
Exercise 3: Understanding how weights change model accuracy
Exercise 4: Coding how weight changes affect accuracy
Exercise 5: Scaling up to multiple data points
Exercise 6: Gradient descent
Exercise 7: Calculating slopes
Exercise 8: Improving model weights
Exercise 9: Making multiple updates to weights
Exercise 10: Backpropagation
Exercise 11: The relationship between forward and backward propagation
Exercise 12: Thinking about backward propagation
Exercise 13: Backpropagation in practice
Exercise 14: A round of backpropagation
