
Quiz 2 - Question 2

A GPU has a peak performance of 500 TeraFLOPS (floating-point operations per second). You want to train a model that requires a total of 2 ExaFLOPs for one epoch. Approximately how long will one training epoch take on this GPU?

Hint: 1 ExaFLOP = 1,000 PetaFLOPs, and 1 PetaFLOP = 1,000 TeraFLOPs.
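The conversion in the hint can be worked through in a few lines. This is a sketch of the arithmetic only, assuming the GPU sustains its peak 500 TeraFLOPS throughput for the whole epoch (real utilization is lower):

```python
# Total work for one epoch: 2 ExaFLOPs = 2,000 PetaFLOPs = 2,000,000 TeraFLOPs
total_flops = 2e18

# Peak rate: 500 TeraFLOPS = 500e12 floating-point operations per second
peak_rate = 500e12

# Time = work / rate
seconds = total_flops / peak_rate
print(f"{seconds:.0f} s = {seconds / 60:.1f} min = {seconds / 3600:.2f} h")
# → 4000 s = 66.7 min = 1.11 h
```

So one epoch takes about 4,000 seconds, roughly 1.1 hours, at peak throughput.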

This exercise is part of the course Google DeepMind: Accelerate Your Model.
