Quiz 2 - Question 2

A GPU has a peak performance of 500 TeraFLOPS (floating-point operations per second). You want to train a model that requires a total of 2 ExaFLOPs for one epoch. Approximately how long will one training epoch take on this GPU?

Hint: 1 ExaFLOP = 1,000 PetaFLOPs, and 1 PetaFLOP = 1,000 TeraFLOPs.
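A quick sanity check of the arithmetic, assuming the GPU actually sustains its 500 TeraFLOPS peak throughout the epoch (real utilization is usually lower, so this is a best-case estimate):

```python
# Total work for one epoch: 2 ExaFLOPs = 2 * 10^18 floating-point operations.
total_flops = 2e18

# Peak throughput: 500 TeraFLOPS = 500 * 10^12 operations per second.
throughput_flops_per_sec = 500e12

# Time = total work / throughput.
seconds = total_flops / throughput_flops_per_sec
print(seconds)              # 4000.0 seconds
print(seconds / 60)         # ~66.7 minutes, i.e. a bit over an hour
```

So one epoch takes about 4,000 seconds, roughly 67 minutes, at peak performance.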

This exercise is part of the course

Google DeepMind: Accelerate Your Model
