Computing & AI

token

/TOH-kuhn/

A fragment of text, encoded as an integer ID, that serves as the fundamental unit of AI language processing. Tokens average roughly 3-4 characters of English text, so approximately 1,300 tokens correspond to 1,000 words. When you interact with an AI system like ChatGPT, your text is first converted into tokens before the model processes it.
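The text-to-integer mapping described above can be sketched with a toy tokenizer. This is an illustrative simplification, not any real system's algorithm: production models (including ChatGPT) use byte-pair encoding over vocabularies of tens of thousands of entries, while the hand-built vocabulary and greedy longest-match rule below exist only for this example.

```python
def toy_tokenize(text, vocab):
    """Greedily match the longest known fragment at each position
    and return its integer ID -- a simplified sketch of tokenization."""
    ids = []
    i = 0
    while i < len(text):
        # Try the longest possible fragment first, shrinking to one character.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                ids.append(vocab[piece])
                i = j
                break
        else:
            # Unknown character: assign it a fresh ID on the fly.
            ids.append(vocab.setdefault(text[i], len(vocab)))
            i += 1
    return ids

# A hypothetical six-entry vocabulary; note fragments need not align
# with word boundaries ("tok" + "ens" together spell "tokens").
vocab = {"Chat": 0, "GPT": 1, " reads": 2, " tok": 3, "ens": 4, ".": 5}
print(toy_tokenize("ChatGPT reads tokens.", vocab))  # [0, 1, 2, 3, 4, 5]
```

Three words of input become six tokens here, which mirrors why token counts run higher than word counts in real systems.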

Referenced in the Book

Discussed in Chapter 1 of This Is Server Country
