SocraticTutor LLM Wiki


Sequence Models & Attention

Apr 11, 2026 · 1 min read

  • subject/sequence-models-and-attention


From RNNs to the Transformer revolution — the architectures that made large language models possible.

Topics (5)

Beginner

  • Attention Mechanism
  • Multi-Head Attention
  • RNNs & LSTMs
  • Self-Attention
  • Transformer Architecture
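The topics above all build on one core operation. As an illustrative sketch (not drawn from any specific course material here), scaled dot-product attention — the mechanism shared by self-attention and multi-head attention — can be written in a few lines of NumPy; all names and shapes below are hypothetical examples:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V — the core of Transformer attention."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V, weights                   # weighted sum of values

# Toy self-attention: 3 tokens with dimension 4, Q = K = V = X
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(X, X, X)
```

Multi-head attention simply runs several such operations in parallel on learned projections of the input and concatenates the results.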


Created with Quartz v4.5.2 © 2026
