
Introduction

1. Introduction

Welcome to this course on Intermediate ChatGPT. I’m Alex Banks, your guide to help you unlock the full potential of ChatGPT for creative exploration.

2. ChatGPT's explosive growth

OpenAI released ChatGPT on November 30, 2022. In just five days, it had 1 million users.

3. ChatGPT's explosive growth

After two months, this number grew to 100 million, making it the fastest-growing consumer application at the time. This staggering growth outpaced both TikTok and Instagram to the 100 million mark.

4. ChatGPT's explosive growth

But what catalyzed this viral growth? To understand what made generative AI go mainstream, we must first understand what makes ChatGPT remarkable — large language models.

5. A primer on Large Language Models (LLMs)

Large language models simply predict the next word in a sequence of words. For example, if we have a string of words: “the cat sat on a”, what could the next word be? Well, with a 97% likelihood, the model predicts it will be “mat.”

6. A primer on Large Language Models (LLMs)

The standard way to train an LLM is to have it forecast the next word in a sequence of words. The model's prediction is compared to the actual word in the text, and training repeats until it can generate accurate responses. Two common training objectives are next-token prediction, predicting the next word in a sequence, and masked-language modeling, predicting a masked word in the middle.
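To make next-token prediction concrete, here is a minimal sketch using a bigram model, a drastic simplification that counts which word follows which. Real LLMs use neural networks trained on trillions of tokens, but the objective is the same: given the words so far, assign probabilities to the next word. The corpus and probabilities here are made up purely for illustration.

```python
from collections import Counter, defaultdict

# Toy corpus -- real LLMs train on vastly larger text collections.
corpus = "the cat sat on a mat . the dog sat on a rug .".split()

# Count which word follows each word (a bigram model, standing in
# for a neural language model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most likely next word and its estimated probability."""
    counts = following[word]
    total = sum(counts.values())
    best, n = counts.most_common(1)[0]
    return best, n / total

print(predict_next("sat"))  # ('on', 1.0): "sat" is always followed by "on" here
```

Masked-language modeling works the same way in spirit, except the model predicts a hidden word in the middle of the sequence using context from both sides.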

7. The dawn of AI: Recurrent models

Next-token prediction and masked-language modeling are both common training tasks for recurrent neural networks, or RNNs. RNNs gained popularity in the 1990s and early 2000s because they can process data one piece at a time and remember what they have seen. But when sequences get really long, recurrent models struggle to maintain relevance and context. They use a type of memory called a "hidden state," which can only retain information for a limited amount of time. This limitation, which stems from the vanishing gradient problem, hampered their effectiveness in complex tasks.
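A toy sketch can show why early inputs fade from an RNN's memory. The hidden state is a running summary of everything seen so far, updated one step at a time; the weights below are made up purely for illustration, and a real RNN uses vectors and nonlinearities rather than single numbers.

```python
# Minimal recurrent update: new state mixes the old state with the new input.
def rnn_step(hidden, x, w_h=0.5, w_x=1.0):
    return w_h * hidden + w_x * x

hidden = 0.0
for x in [1.0, 0.0, 0.0, 0.0, 0.0]:  # one "important" token, then silence
    hidden = rnn_step(hidden, x)

# The first input's contribution has shrunk to 0.5**4 = 0.0625.
# With the recurrent weight below 1, early tokens fade exponentially --
# a toy picture of why long-range context is hard for RNNs.
print(hidden)  # 0.0625
```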

8. Groundbreaking architecture: The transformer

The limitations of RNNs set the stage for a revolution. In 2017, a group of researchers at Google unveiled a groundbreaking architecture that would reshape the landscape of natural language processing, or NLP. Their paper, "Attention Is All You Need," introduced the world to the transformer architecture. Unlike RNNs, transformers do not process data sequentially. Instead, they use attention mechanisms to weigh the significance of each word in a sequence, irrespective of its position. This ability to examine all parts of the input simultaneously brought major gains in processing speed and efficiency, enabling the processing of longer text and, eventually, ChatGPT itself.
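The core idea of attention can be sketched in a few lines. In this toy version, each word gets a single made-up number as its "embedding" (real transformers use large vectors), every word is scored against a query at once with no left-to-right recurrence, and a softmax turns the scores into weights that mix the values together.

```python
import math

def softmax(scores):
    """Turn raw scores into weights that are positive and sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy one-number "embeddings" -- values chosen purely for illustration.
words  = ["the", "cat", "sat"]
values = [0.1, 0.9, 0.5]

query = 0.9                               # what we are "looking for"
scores = [query * v for v in values]      # score every word simultaneously
weights = softmax(scores)                 # higher score -> more attention
output = sum(w * v for w, v in zip(weights, values))

# "cat" gets the largest weight because its value best matches the query.
print(dict(zip(words, weights)))
```

Because every score is computed in one pass rather than step by step, the whole sequence can be processed in parallel, which is the speed advantage the paragraph above describes.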

9. Welcome to Intermediate ChatGPT!

Now that you've had a taste of the technology that underpins ChatGPT, let me introduce myself properly. I'm Alex Banks. I've been building and scaling AI products since 2021. I also write Sunday Signal, a newsletter that covers AI highlights and broader insights that intrigue and inspire.

10. Welcome to Intermediate ChatGPT

At the conclusion of these three chapters, you'll be armed with the tools and techniques to impress your friends around the dinner table, not only by describing the technology behind ChatGPT, which you'll learn in Chapter 1...

11. Welcome to Intermediate ChatGPT

...but also by wielding an arsenal of prompting strategies, which you'll learn in Chapter 2, to elicit specific, accurate, and creative responses from ChatGPT across exciting and very practical use cases.

12. Welcome to Intermediate ChatGPT

Chapter 3 will look at the advanced functions of ChatGPT, including writing custom instructions and building your very own GPTs. Let's embark on this voyage together, unraveling the mysteries of AI, which powers so much of our digital world today.

13. Let's practice!

Now, it’s time to get hands-on in the exercises.
