A numerical representation of a text fragment used as the fundamental unit of AI processing. Tokens average three to four characters, so roughly 1,300 tokens correspond to 1,000 words of English text. When you interact with an AI system like ChatGPT, your text is first converted into tokens before processing.
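The splitting step can be sketched with a toy greedy tokenizer. The vocabulary below is entirely made up for illustration; production systems learn subword vocabularies of roughly 100,000 entries via algorithms such as byte-pair encoding, but the longest-match idea is the same:

```python
# Toy greedy tokenizer over a tiny, made-up vocabulary.
# Real tokenizers use learned subword vocabularies (e.g. BPE),
# not a hand-written set like this.
VOCAB = {"token", "iza", "tion", "serv", "er", "s", " "}

def tokenize(text):
    """Greedy longest-match: at each position, consume the longest
    vocabulary entry that matches; unknown characters become
    single-character tokens."""
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try longest piece first
            if text[i:j] in VOCAB:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])  # fall back to one character
            i += 1
    return tokens

print(tokenize("tokenization"))  # ['token', 'iza', 'tion']
print(tokenize("servers"))       # ['serv', 'er', 's']
```

Note that a single English word often becomes several tokens, which is why a 1,000-word passage yields more than 1,000 tokens.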
Discussed in Chapter 1 of This Is Server Country