Transformer Architecture and Attention Mechanism
Master the foundations of Transformers and attention mechanisms that power modern AI applications. Understand tokenization, embeddings, and self-attention to strengthen your AI skills.
Context
Why This Matters
Learning Objectives
Strengths and Weaknesses of LLMs
Discriminative vs Generative AI
Quiz: Generative vs Discriminative AI
Transformers
Embeddings
Attention
Quiz: Introduction to Transformers
Similarity
Activity - Three Ways to Measure Similarity
Key and Query Matrices
Linear Transformations: How Weights Change Similarity
Multi-Head Attention
Attention Mechanism
Sentiment Analysis
Applications
Quiz: Attention Mechanism
Feedforward Neural Networks
Types of Neural Networks
Softmax
How Do We Build Embeddings?
Tokenization
Positional Encoding
Transformer Architecture
Revisiting LLMs
Quiz: Neural Networks and LLM Fundamentals
Introduction
Learning Objectives
How Does Tokenization Work?
Tokenizing a Song
Impact of Language Complexity
Punctuation Sensitivity