Posts by Tags

attention

Attention Is All You Need

14 minute read

Published:

The Core Concepts

Self-Attention 🧠 is a mechanism that allows a sequence (like a sentence) to interact with itself. It calculates how much “attention” each word should pay to every other word in the same sentence. For example, in the sentence “The animal didn’t cross the street because it was too tired,” self-attention helps the model realize that “it” refers to the animal, not the street.
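The pairwise "attention" described above can be sketched as scaled dot-product self-attention. This is a minimal illustration, not the full Transformer layer: the projection matrices `Wq`, `Wk`, `Wv` and the toy inputs are made up for the example, and a real model would learn them during training.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token vectors X."""
    # Project each position into query, key, and value vectors.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    # Score how much each position should attend to every other position.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax so each row of attention weights sums to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted mix of all positions' values.
    return weights @ V, weights

# Toy sequence: 4 "words", each an 8-dimensional embedding (random for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

In the "it was too tired" example, the row of `attn` for the token "it" would ideally place most of its weight on "animal", so the output vector for "it" is dominated by the animal's representation.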

large language models

Attention Is All You Need

llm

Attention Is All You Need

machine learning

Attention Is All You Need