
Defining custom recipes

You're fine-tuning a pre-trained Llama model for a customer who requires specific configurations. You plan to use TorchTune for the fine-tuning, so you need to prepare a Python dictionary to store the requirements for the custom recipe you'll use to run the fine-tuning job.

This exercise is part of the course

Fine-Tuning with Llama 3


Exercise instructions

  • Specify the customer requirements in your dictionary: first, add the torchtune.models.llama3_2.llama3_2_1b model.
  • Add a batch size of 8 and a GPU device.

Interactive exercise

Try this exercise by completing the sample code below.

config_dict = {
    # Define the model
    ____,
    # Define the batch size
    ____,
    # Define the device type
    ____,
    "epochs": 15,
    "optimizer": {"_component_": "bitsandbytes.optim.PagedAdamW8bit", "lr": 3e-05},
    "dataset": {"_component_": "custom_dataset"},
    "output_dir": "/tmp/finetune_results"
}
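
For reference, one possible completed version is sketched below. The model entry follows the same "_component_" pattern already used for the optimizer and dataset; the "batch_size" and "device" keys are the conventional TorchTune names, but treat them as assumptions rather than the only accepted answer.

# A minimal sketch of the completed configuration, assuming the
# conventional TorchTune keys "model", "batch_size", and "device".
config_dict = {
    # Define the model using the same _component_ pattern as the optimizer
    "model": {"_component_": "torchtune.models.llama3_2.llama3_2_1b"},
    # Define the batch size
    "batch_size": 8,
    # Define the device type (GPU)
    "device": "cuda",
    "epochs": 15,
    "optimizer": {"_component_": "bitsandbytes.optim.PagedAdamW8bit", "lr": 3e-05},
    "dataset": {"_component_": "custom_dataset"},
    "output_dir": "/tmp/finetune_results"
}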