
Self vs. multi-head attention

You are a data analyst in an AI development team. Your current project involves understanding and implementing the concepts of self-attention and multi-head attention in a language model. Consider the following phrases from a conversation dataset.

A: "The boy went to the store to buy some groceries."

B: "Oh, he was really excited about getting his favorite cereal."

C: "I noticed that he gestured a lot while talking about it."

Determine whether each phrase would be best analyzed by focusing on relationships within the input data (self-attention) or by attending to multiple aspects of the input data simultaneously (multi-head attention).
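The distinction can be sketched in code. Below is a minimal NumPy illustration (the shapes and random projection matrices are illustrative assumptions, not part of the exercise): self-attention derives queries, keys, and values from the same input sequence, so every token attends to every other token; multi-head attention runs several such heads in parallel, each with its own projections, and concatenates their outputs so each head can capture a different aspect of the input.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Self-attention: Q, K, V are all projections of the SAME input X,
    # so attention weights relate tokens within one sequence.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))  # rows sum to 1
    return weights @ V

def multi_head_attention(X, heads):
    # Multi-head attention: several independent self-attention heads,
    # each with its own (Wq, Wk, Wv), concatenated along the feature axis.
    return np.concatenate(
        [self_attention(X, Wq, Wk, Wv) for Wq, Wk, Wv in heads], axis=-1
    )

# Toy dimensions (hypothetical): 5 tokens, model width 8, two heads of width 4.
rng = np.random.default_rng(0)
seq_len, d_model, d_head, n_heads = 5, 8, 4, 2
X = rng.standard_normal((seq_len, d_model))
heads = [
    tuple(rng.standard_normal((d_model, d_head)) for _ in range(3))
    for _ in range(n_heads)
]
out = multi_head_attention(X, heads)
print(out.shape)  # (5, 8): the two 4-wide head outputs concatenated
```

With a single head this reduces to plain self-attention; the multi-head version differs only in running the same computation several times with different learned projections.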

This exercise is part of the course

Large Language Models (LLMs) Concepts
