Context Window
The maximum amount of text (measured in tokens) an LLM can process in a single interaction. Larger windows allow the model to take in more code and documentation at once. Sizes vary by model: GPT-4 Turbo (128K tokens), Claude 3 (200K tokens), Gemini 1.5 (1M+ tokens). One token ≈ 4 characters of English text. Context window limits determine how much of a codebase an AI can analyze in a single request.
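The "1 token ≈ 4 characters" rule of thumb can be turned into a quick fit check before sending text to a model. A minimal sketch, assuming the illustrative window sizes from the entry above (a real tokenizer such as tiktoken gives exact counts; this is only a heuristic):

```python
# Rough context-window check using the "1 token ≈ 4 characters" heuristic.
# Window sizes below are the illustrative figures from the entry above.

CONTEXT_WINDOWS = {
    "gpt-4-turbo": 128_000,
    "claude-3": 200_000,
    "gemini-1.5": 1_000_000,
}

def estimate_tokens(text: str) -> int:
    """Approximate token count for English text (~4 chars per token)."""
    return max(1, len(text) // 4)

def fits_in_window(text: str, model: str) -> bool:
    """True if the text likely fits in the model's context window."""
    return estimate_tokens(text) <= CONTEXT_WINDOWS[model]

# Example: ~29K characters of code, roughly 7K estimated tokens.
doc = "def hello():\n    print('hi')\n" * 1000
print(estimate_tokens(doc))
print(fits_in_window(doc, "gpt-4-turbo"))
```

Because the heuristic undercounts for dense code and non-English text, production tooling should leave a safety margin or use the model's actual tokenizer.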
Related terms
AI & ML
LLM (Large Language Model)
A neural network trained on vast text corpora to understand and generate human language. LLMs (GPT-4, Claude, Llama, Gem...
AI & ML
Token (AI/NLP)
The basic unit of text processed by language models—typically a word, subword, or character. Tokenizers (BPE, SentencePi...