Computing & AI

attention mechanism

A mathematical process that allows AI models to focus on the relevant parts of their input when generating output. Attention mechanisms enable transformers to understand context by weighting the importance of each token relative to every other token. This is why AI can understand that "bank" means different things in "river bank" versus "bank account."
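The idea of "weighting tokens relative to each other" can be sketched concretely. Below is a minimal NumPy implementation of scaled dot-product attention, the standard formulation used in transformers; the matrices Q, K, and V (queries, keys, values) and the toy sizes are illustrative assumptions, not anything from the book.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax: subtract the max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # scores[i, j] measures how relevant token j is to token i
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # each row of weights is a probability distribution over tokens
    weights = softmax(scores, axis=-1)
    # the output for token i is a weighted average of the value vectors
    return weights @ V, weights

# toy example: 3 tokens, each represented by a 4-dimensional vector
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = attention(Q, K, V)
# every row of w sums to 1: a distribution of attention over the tokens
```

Because the weights depend on the surrounding tokens, the same word ("bank") receives a different weighted mix of context in different sentences, which is what lets the model disambiguate it.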

Referenced in the Book

Discussed in Chapter 1 of This Is Server Country
