Saving custom recipes
The customer has now asked you for a modification to the requirements. This time, they'd like to increase the number of parameters by switching to the Llama 3.2 model with 3B parameters. Make this modification in your dictionary, then save it as a YAML file.
The yaml library has been pre-imported.
This exercise is part of the course Fine-Tuning with Llama 3.

Exercise instructions
- Specify the new model requirement, the torchtune.models.llama3_2.llama3_2_3b model, in your dictionary.
- Save the requirements as a YAML file named custom_recipe.yaml.
Hands-on interactive exercise

Finish this exercise by completing the sample code.
config_dict = {
    # Update the model
    ____,
    "batch_size": 8,
    "device": "cuda",
    "optimizer": {"_component_": "bitsandbytes.optim.PagedAdamW8bit", "lr": 3e-05},
    "dataset": {"_component_": "custom_dataset"},
    "output_dir": "/tmp/finetune_results"
}

# Save the updated configuration to a new YAML file
with open("custom_recipe.yaml", "w") as yaml_file:
    ____
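For reference, a completed version might look like the sketch below. It assumes the model is registered under a "model" key following the same "_component_" pattern already used for the optimizer and dataset entries, and it uses PyYAML's yaml.dump to write the dictionary to disk; the yaml module is stated above as pre-imported, but the import is shown for completeness.

import yaml  # pre-imported in the exercise environment; shown here so the snippet is self-contained

# One possible completed configuration, using the 3B Llama 3.2 model
config_dict = {
    # Update the model (assumes the "model"/"_component_" convention used by the other entries)
    "model": {"_component_": "torchtune.models.llama3_2.llama3_2_3b"},
    "batch_size": 8,
    "device": "cuda",
    "optimizer": {"_component_": "bitsandbytes.optim.PagedAdamW8bit", "lr": 3e-05},
    "dataset": {"_component_": "custom_dataset"},
    "output_dir": "/tmp/finetune_results"
}

# Save the updated configuration to a new YAML file
with open("custom_recipe.yaml", "w") as yaml_file:
    yaml.dump(config_dict, yaml_file)

As a quick sanity check, you could read the file back with yaml.safe_load(open("custom_recipe.yaml")) and confirm it returns the same dictionary.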