Quiz 3 - Question 1

You are preparing to train a 3-billion parameter language model using the Adam optimizer and standard 32-bit (float32) precision. What is the approximate GPU memory required for just the model parameters, gradients, and optimizer states?
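As a worked check of the arithmetic behind this question: in float32 each value takes 4 bytes, and standard Adam keeps two extra per-parameter states (the first and second moment estimates). The sketch below tallies parameters, gradients, and the two Adam moments for 3 billion parameters; activations, optimizer buffers, and framework overhead are deliberately excluded, since the question asks only about these three components.

```python
# Rough fp32 training memory estimate for a 3B-parameter model with Adam.
# Counts only parameters, gradients, and Adam moment states.
N = 3_000_000_000   # parameter count
BYTES_FP32 = 4      # bytes per float32 value

weights = N * BYTES_FP32   # model parameters
grads   = N * BYTES_FP32   # one gradient per parameter
adam_m  = N * BYTES_FP32   # Adam first-moment estimate
adam_v  = N * BYTES_FP32   # Adam second-moment estimate

total_bytes = weights + grads + adam_m + adam_v
print(f"{total_bytes / 1e9:.0f} GB")  # 16 bytes/param x 3B = 48 GB
```

The useful rule of thumb this illustrates: fp32 training with Adam costs about 16 bytes per parameter (4 each for weights, gradients, and the two moment estimates), so a 3B model needs roughly 48 GB before activations are even counted.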

This exercise is part of the course

Google DeepMind: Accelerate Your Model
