AI

Token (AI)

Basic unit of text processed by an LLM

Definition

A token is the basic unit of text that a language model processes — roughly 3–4 characters or about ¾ of a word. LLM pricing, context windows, and performance are all measured in tokens. Understanding tokens helps optimize AI API costs.
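The 3–4-characters-per-token rule of thumb above can be turned into a quick estimator. This is only a heuristic for planning, not a real tokenizer: actual counts come from a model-specific tokenizer (e.g. OpenAI's tiktoken) and vary with content and language.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4-characters-per-token rule of thumb.

    Real tokenizers use learned subword vocabularies, so actual counts
    differ by model and content; treat this only as a planning aid.
    """
    return max(1, round(len(text) / 4))


print(estimate_tokens("Understanding tokens helps optimize AI API costs."))
```

For billing or context-window decisions, always count with the provider's own tokenizer rather than a character heuristic.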

📌 Example

GPT-4 Turbo has a 128K-token context window, roughly 100,000 words. Claude's 200K-token context window can fit an entire novel. At $15 per million tokens, a 1,000-token request costs $0.015.
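The cost arithmetic in the example above is just a per-million-token rate applied to the request size; a one-line helper makes it reusable (the $15-per-million rate is the example's figure, not a current price list):

```python
def request_cost(tokens: int, price_per_million_usd: float) -> float:
    """Cost in USD of a request billed at a per-million-token rate."""
    return tokens / 1_000_000 * price_per_million_usd


# The figures from the example: 1,000 tokens at $15 per million tokens.
print(f"${request_cost(1_000, 15.0):.3f}")  # prints "$0.015"
```

Note that most APIs price input (prompt) and output (completion) tokens at different rates, so a real cost estimate applies this calculation twice.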