
Context Window

The maximum amount of text, measured in tokens, that an LLM can process in a single interaction. Larger windows let a model take in more code or documentation at once. Sizes vary by model: GPT-4 supports 128K tokens, Claude 200K, and Gemini 1M or more. One token corresponds to roughly four characters of English text. Context window limits therefore determine how much of a codebase an AI can analyze in a single request.
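The "one token ≈ 4 characters" rule of thumb can be turned into a quick budget check. The sketch below is an approximation only (real tokenizers such as tiktoken give exact counts); the model names, window sizes, and `reserve` parameter simply restate the figures from this entry and are illustrative, not an API.

```python
CHARS_PER_TOKEN = 4  # heuristic average for English text

# Context window sizes cited above, in tokens
CONTEXT_WINDOWS = {
    "gpt-4": 128_000,
    "claude": 200_000,
    "gemini": 1_000_000,
}

def estimate_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_window(text: str, model: str, reserve: int = 4_000) -> bool:
    """Check whether text fits, reserving room for the model's reply."""
    limit = CONTEXT_WINDOWS[model] - reserve
    return estimate_tokens(text) <= limit

doc = "x" * 1_000_000          # ~250K estimated tokens
print(fits_in_window(doc, "gpt-4"))   # False: exceeds a 128K window
print(fits_in_window(doc, "gemini"))  # True: fits in a 1M window
```

Reserving some of the window for the model's output matters in practice: a request that exactly fills the window leaves no room for a response.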
