Congratulations!

1. Congratulations!

You did it! Congratulations on making your very own transformer models! Let's recap what you've accomplished.

2. Chapter 1

In Chapter 1, you learned about the key components and processes in the transformer architecture, including token embeddings, positional encoding, and attention mechanisms. You started your transformer journey with torch.nn's Transformer class to quickly define a transformer, then began writing your own classes to compute token embeddings, positional encodings, and attention scores.
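As a quick refresher, here's a minimal sketch of that starting point using PyTorch's built-in nn.Transformer; the hyperparameters and vocabulary size are illustrative, not the course's exact values:

```python
import torch
import torch.nn as nn

# Built-in transformer; hyperparameters here are illustrative
model = nn.Transformer(
    d_model=512,           # embedding dimension
    nhead=8,               # number of attention heads
    num_encoder_layers=6,
    num_decoder_layers=6,
    batch_first=True,      # inputs shaped (batch, seq, d_model)
)

# Token embeddings map token IDs to d_model-dimensional vectors
# (positional encoding is omitted here for brevity)
vocab_size = 10_000        # hypothetical vocabulary size
embedding = nn.Embedding(vocab_size, 512)

src_ids = torch.randint(0, vocab_size, (32, 16))        # (batch, seq)
tgt_ids = torch.randint(0, vocab_size, (32, 16))
output = model(embedding(src_ids), embedding(tgt_ids))  # (32, 16, 512)
```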

3. Chapter 2 - Encoders and Decoders

With all the components in place, you built encoder-only and decoder-only transformers, the latter of which is used in most modern large language models. You built these in a modular way, defining encoder and decoder layer classes, then combining these into blocks of layers.
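Here's a compact sketch of that modular pattern for a decoder-only model; the layer internals (layer norms, feed-forward size, masking details) are illustrative and may differ from your course implementation:

```python
import torch
import torch.nn as nn

class DecoderLayer(nn.Module):
    """One decoder layer: masked self-attention plus a feed-forward network."""
    def __init__(self, d_model, num_heads, ff_dim):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, ff_dim), nn.ReLU(), nn.Linear(ff_dim, d_model)
        )
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x, causal_mask):
        attn_out, _ = self.self_attn(x, x, x, attn_mask=causal_mask)
        x = self.norm1(x + attn_out)       # residual connection + layer norm
        return self.norm2(x + self.ff(x))

class DecoderBlock(nn.Module):
    """A stack of decoder layers: the backbone of decoder-only LLMs."""
    def __init__(self, d_model, num_heads, ff_dim, num_layers):
        super().__init__()
        self.layers = nn.ModuleList(
            DecoderLayer(d_model, num_heads, ff_dim) for _ in range(num_layers)
        )

    def forward(self, x, causal_mask):
        for layer in self.layers:
            x = layer(x, causal_mask)
        return x

seq_len = 16
# Boolean causal mask: True marks positions a token may NOT attend to (the future)
mask = torch.triu(torch.ones(seq_len, seq_len), diagonal=1).bool()
block = DecoderBlock(d_model=512, num_heads=8, ff_dim=2048, num_layers=6)
out = block(torch.randn(32, seq_len, 512), mask)   # (32, 16, 512)
```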

4. Chapter 2 - Encoder-decoder transformer

Finally, you created an encoder-decoder transformer by passing outputs from the encoder block into the decoder block and implementing a cross-attention mechanism.
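In miniature, cross-attention looks like this, assuming PyTorch's nn.MultiheadAttention and illustrative tensor shapes: the decoder supplies the queries, while the encoder's outputs supply the keys and values:

```python
import torch
import torch.nn as nn

d_model, num_heads = 512, 8
cross_attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)

encoder_out = torch.randn(32, 20, d_model)  # output of the encoder block
decoder_x = torch.randn(32, 16, d_model)    # decoder's masked self-attention output

# Queries come from the decoder; keys and values come from the encoder,
# letting each target position attend over the full source sequence
attn_out, _ = cross_attn(query=decoder_x, key=encoder_out, value=encoder_out)
print(attn_out.shape)  # torch.Size([32, 16, 512])
```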

5. What next?

The next natural step for your model is to begin training it. Training transformers to generate meaningful text often requires GPUs, significant computational resources, and long training times. You can learn about efficient training techniques in the following course.

In many cases, building and training a transformer from scratch may not be necessary, and you can instead start from existing pre-trained LLMs made available to the AI community. Hugging Face is an ecosystem that makes models and datasets freely available to the community. Explore Hugging Face and work with pre-trained LLMs in the following courses.

A pre-trained LLM may not perform well enough out of the box. In these cases, you may need to fine-tune it on your custom dataset, which involves updating a subset of the model parameters. You can learn how to do this with Meta's Llama model in this course.

One final thing: I recommend checking out the original Attention Is All You Need paper. You may feel that academic papers are beyond your interest or capabilities, but I think you'd be surprised by how much you can now understand.
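To give a flavor of what updating a subset of parameters means, here's a minimal sketch using the Hugging Face transformers library, with "gpt2" standing in as an example checkpoint (the Llama course uses its own model and training setup):

```python
from transformers import AutoModelForCausalLM

# Load a small pre-trained LLM from the Hugging Face Hub ("gpt2" as an example)
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Freeze every parameter, then unfreeze only the final transformer block,
# so training updates just a small subset of the model's weights
for param in model.parameters():
    param.requires_grad = False
for param in model.transformer.h[-1].parameters():
    param.requires_grad = True

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable:,}")
```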

6. Let's practice!

Congratulations again! We hope to see you in another course soon!