
Multiple linear regression

In most cases, a univariate linear regression will not yield a model that makes accurate predictions. In this exercise, you will perform a multiple linear regression, which uses more than one feature.

You will use price_log as your target and size_log and bedrooms as your features; each of these tensors has been defined and is available. You will also switch from the mean squared error loss to the mean absolute error loss, keras.losses.mae(). Finally, the predicted values are computed as params[0] + feature1*params[1] + feature2*params[2]. Note that we've defined the parameters as a single vector variable, params, rather than as three separate variables: params[0] is the intercept and params[1] and params[2] are the slopes.
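For intuition, a parameter vector like the one described here can be created as a single trainable variable; the initial values below are arbitrary placeholders, not the ones used by the exercise:

import tensorflow as tf

# One trainable vector: params[0] is the intercept, params[1] and params[2] are the slopes
params = tf.Variable([0.1, 0.05, 0.01], dtype=tf.float32)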

This exercise is part of the course Introduction to TensorFlow in Python.


Exercise instructions

  • Define a linear regression model that returns the predicted values.
  • Set loss_function() to take the parameter vector as an input.
  • Use the mean absolute error loss.
  • Complete the minimization operation.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Define the linear regression model
def linear_regression(params, feature1 = size_log, feature2 = bedrooms):
	return params[0] + feature1*____ + feature2*____

# Define the loss function
def loss_function(____, targets = price_log, feature1 = size_log, feature2 = bedrooms):
	# Set the predicted values
	predictions = linear_regression(params, feature1, feature2)
  
	# Use the mean absolute error loss
	return keras.losses.____(targets, predictions)

# Define the optimize operation
opt = keras.optimizers.Adam()

# Perform minimization and print trainable variables
for j in range(10):
	opt.minimize(lambda: loss_function(____), var_list=[____])
	print_results(params)
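
For reference, a minimal end-to-end sketch of the completed exercise is shown below. It assumes a TensorFlow 2.x release (pre-Keras 3) in which keras.optimizers.Adam().minimize() accepts a callable loss, and it substitutes synthetic size_log, bedrooms, and price_log tensors and a simple print_results() helper for the objects the exercise environment provides:

import numpy as np
import tensorflow as tf
from tensorflow import keras

# Synthetic stand-ins for the tensors the exercise predefines
np.random.seed(0)
size_log = tf.constant(np.random.uniform(6.0, 9.0, 100).astype(np.float32))
bedrooms = tf.constant(np.random.randint(1, 6, 100).astype(np.float32))
price_log = tf.constant((0.5 + 1.2 * size_log.numpy() + 0.1 * bedrooms.numpy()
                         + np.random.normal(0.0, 0.1, 100)).astype(np.float32))

# Intercept and two slopes stored in a single trainable vector
params = tf.Variable([0.1, 0.05, 0.01], dtype=tf.float32)

# Define the linear regression model
def linear_regression(params, feature1=size_log, feature2=bedrooms):
    return params[0] + feature1 * params[1] + feature2 * params[2]

# Define the loss function over the parameter vector
def loss_function(params, targets=price_log, feature1=size_log, feature2=bedrooms):
    predictions = linear_regression(params, feature1, feature2)
    # Use the mean absolute error loss
    return keras.losses.mae(targets, predictions)

# Hypothetical helper standing in for the exercise's print_results()
def print_results(params):
    intercept, slope1, slope2 = params.numpy()
    print('intercept: {:.3f}, slope_1: {:.3f}, slope_2: {:.3f}'.format(intercept, slope1, slope2))

# Define the optimizer and perform minimization
opt = keras.optimizers.Adam()
for j in range(10):
    opt.minimize(lambda: loss_function(params), var_list=[params])
    print_results(params)

With more iterations (or a larger learning rate), the printed parameters should settle toward values that reduce the mean absolute error on this data.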