Token

The smallest unit of text a language model processes — not a word, not a character, but something in between. One token is roughly ¾ of an English word. "Hamburger" is three tokens. A newline is one.

Everything in LLM-land is measured in tokens: context window size, output limits, API pricing. When someone says a model has a 200k context window, they mean 200,000 tokens — roughly 150,000 words.
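The ¾ rule of thumb above is just arithmetic; a minimal sketch (helper name is hypothetical, and the ratio is only an approximation):

```python
# Rule of thumb: 1 token ≈ ¾ of an English word.
def tokens_to_words(tokens: int) -> int:
    """Rough word count for a given token budget."""
    return int(tokens * 0.75)

print(tokens_to_words(200_000))  # a 200k context window → 150000 words
```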

© 2026 siever.ing