
Quiz 2 - Question 2

A GPU has a peak performance of 500 TeraFLOPS (floating-point operations per second). You want to train a model that requires a total of 2 ExaFLOPs for one epoch. Approximately how long will one training epoch take on this GPU?

Hint: 1 ExaFLOP = 1,000 PetaFLOPs, and 1 PetaFLOP = 1,000 TeraFLOPs.
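Assuming the GPU sustains its peak rate (in practice utilization is lower), the conversion in the hint can be worked through as a quick sketch:

```python
# Worked conversion sketch; assumes the GPU runs at its peak rate
# for the whole epoch, which real training never quite achieves.
total_flops = 2e18   # 2 ExaFLOPs = 2 * 10^18 floating-point operations
gpu_rate = 500e12    # 500 TeraFLOPS = 5 * 10^14 operations per second

seconds = total_flops / gpu_rate
print(f"{seconds:.0f} s = {seconds / 60:.1f} min = {seconds / 3600:.2f} h")
# → 4000 s = 66.7 min = 1.11 h
```

So one epoch takes roughly 4,000 seconds, a little over an hour.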

This exercise is part of the course

Google DeepMind: Accelerate Your Model
