
transformer

The dominant neural network architecture for modern AI systems, introduced in the 2017 paper "Attention Is All You Need." Transformers use attention mechanisms to process input sequences in parallel, enabling them to model relationships between words regardless of their position in the text. GPT, Claude, and most large language models are built on the transformer architecture.
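The attention mechanism at the heart of the transformer can be sketched in a few lines. The following is an illustrative NumPy toy (not from the book): scaled dot-product self-attention, where every position computes a similarity score against every other position and the output is a weighted mix of value vectors. The function name and toy dimensions are assumptions for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Scores measure the similarity of every query to every key,
    # so each position can attend to every other position at once.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns each row of scores into weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: a weighted average of the value vectors.
    return weights @ V

# Toy example: a 3-token sequence with 4-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
print(out.shape)  # (3, 4)
```

Because the score matrix is computed for all positions simultaneously, the whole sequence is processed in parallel rather than one token at a time, which is what distinguishes transformers from earlier recurrent architectures.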

Referenced in the Book

Discussed in Chapter 1 of This Is Server Country

