
Quiz 3 - Question 1

Assume you have a pre-trained model with 1 billion parameters, and you are considering fine-tuning it with LoRA. The LoRA matrices A and B together constitute 0.1% of the total weights. For inference, you need to store 4 bytes per parameter; for training, you need to store 16 bytes per parameter. Assume 1 KB is 1,000 bytes and 1 GB is 1,000,000,000 bytes. How much memory do you need to fine-tune this 1-billion-parameter model with LoRA?
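One way to sanity-check the arithmetic is a short script. This sketch assumes the usual LoRA reading of the question: the frozen base weights only incur the 4-byte inference cost, while the 0.1% of parameters in the A and B matrices incur the 16-byte training cost. The variable names are my own, not from the question.

```python
total_params = 1_000_000_000      # 1 billion parameters
lora_fraction = 0.001             # A and B are 0.1% of all weights
bytes_inference = 4               # bytes per frozen (inference-only) parameter
bytes_training = 16               # bytes per trainable parameter
GB = 1_000_000_000                # 1 GB = 1e9 bytes, per the question

lora_params = int(total_params * lora_fraction)   # 1,000,000 trainable params
base_bytes = total_params * bytes_inference       # frozen base weights
lora_bytes = lora_params * bytes_training         # LoRA training state

total_gb = (base_bytes + lora_bytes) / GB
print(f"{total_gb} GB")           # prints "4.016 GB"
```

Under this assumption, the frozen base model dominates (4 GB), and the LoRA training overhead adds only 0.016 GB.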

This exercise is part of the course Google DeepMind: Fine-Tune Your Model.
