Add comprehensive educational materials for bidirectional attention #1
Open
Conversation
This commit adds extensive learning resources to help understand bidirectional attention and modern LLM architectures.

**Educational Content:**
- `docs/bidirectional_attention_tutorial.md`: Deep dive into bidirectional vs. causal attention, with mathematical formulations and examples
- `LEARNING_GUIDE.md`: Structured 7-phase learning path with exercises
- `docs/quick_reference.md`: One-page reference for quick lookups

**Interactive Tools:**
- `attention_comparison.py`: Side-by-side comparison of causal vs. bidirectional attention, with visualizations
- `visualize_model_attention.py`: Extracts and visualizes attention patterns from the trained diffusion model

**Enhanced Code:**
- `model.py`: Added extensive inline comments to `BidirectionalAttention`, `apply_rotary_emb`, and the norm functions, explaining every design decision, shape transformation, and architectural choice

These materials enable aspiring LLM researchers to:
1. Deeply understand bidirectional attention mechanisms
2. Compare causal (GPT-style) and bidirectional (BERT-style) attention
3. Learn modern components: RoPE, RMSNorm, QK normalization
4. Visualize attention patterns interactively
5. Understand when to use each attention type
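To illustrate the core distinction the tutorial covers, here is a minimal sketch of causal vs. bidirectional attention weights. This is a hypothetical example, not code from `attention_comparison.py`; the function names and shapes are illustrative only.

```python
# Sketch: causal vs. bidirectional attention weights on random Q/K.
# Hypothetical helper names -- not the repo's actual API.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_weights(q, k, causal):
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)  # (T, T) scaled dot-product scores
    if causal:
        # GPT-style: token i may only attend to positions j <= i,
        # so mask out the strict upper triangle before the softmax.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)
    # BERT-style (causal=False): no mask, every token sees every token.
    return softmax(scores, axis=-1)

rng = np.random.default_rng(0)
T, d = 4, 8
q = rng.standard_normal((T, d))
k = rng.standard_normal((T, d))

causal_w = attention_weights(q, k, causal=True)
bidir_w = attention_weights(q, k, causal=False)

# Causal weights are lower-triangular: zero above the diagonal.
assert np.allclose(np.triu(causal_w, k=1), 0.0)
# Bidirectional weights are dense: every position attends everywhere.
assert (bidir_w > 0).all()
```

Each row of both matrices still sums to 1; the mask only changes *which* positions share that probability mass, which is why causal attention suits next-token prediction while bidirectional attention suits whole-sequence modeling.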