
Defining custom recipes

You're fine-tuning a pre-trained Llama model for a customer who requires specific configurations. You plan to use TorchTune for the fine-tuning, so you need to prepare a Python dictionary that stores the requirements for the custom recipe you'll use to run the fine-tuning job.

This exercise is part of the course

Fine-Tuning with Llama 3


Exercise instructions

  • Specify the customer requirements in your dictionary: first, add the torchtune.models.llama3_2.llama3_2_1b model.
  • Add a batch size of 8 and a GPU device.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

config_dict = {
    # Define the model
    ____,
    # Define the batch size
    ____,
    # Define the device type
    ____,
    "epochs": 15,
    "optimizer": {"_component_": "bitsandbytes.optim.PagedAdamW8bit", "lr": 3e-05},
    "dataset": {"_component_": "custom_dataset"},
    "output_dir": "/tmp/finetune_results"
}
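For reference, here is one possible completion of the dictionary above. It assumes the standard TorchTune config conventions shown elsewhere in the sample (the `"_component_"` key for components, plus top-level `"batch_size"` and `"device"` keys); treat it as a sketch rather than the single correct answer.

```python
# A possible completed config, following the TorchTune-style keys
# already used in the sample ("_component_", "optimizer", "dataset").
config_dict = {
    # Define the model: the 1B-parameter Llama 3.2 variant the customer requires
    "model": {"_component_": "torchtune.models.llama3_2.llama3_2_1b"},
    # Define the batch size
    "batch_size": 8,
    # Define the device type: "cuda" selects the GPU
    "device": "cuda",
    "epochs": 15,
    "optimizer": {"_component_": "bitsandbytes.optim.PagedAdamW8bit", "lr": 3e-05},
    "dataset": {"_component_": "custom_dataset"},
    "output_dir": "/tmp/finetune_results",
}
```

With the dictionary populated, individual requirements can be read back with normal key access, e.g. `config_dict["batch_size"]` returns `8`.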