A single learned numerical value (such as a weight or bias) within an AI model's neural network. Modern large language models contain billions to trillions of parameters: GPT-3 has 175 billion, and GPT-4 is estimated to have over 1 trillion. More parameters generally enable more sophisticated capabilities, but they also demand proportionally more computing resources for inference.
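To make the counting concrete, here is a minimal sketch in Python that tallies the parameters of a tiny fully connected network. The layer widths are hypothetical, chosen only for illustration; the principle is that every connection weight and every bias is one parameter:

```python
# Illustrative sketch: counting parameters in a toy fully connected network.
# The layer widths below are hypothetical, not taken from any real model.
# A dense layer mapping n_in inputs to n_out outputs has n_in * n_out
# weights plus n_out biases -- each of those numbers is one "parameter".

layer_sizes = [768, 3072, 768]  # hypothetical layer widths

total_params = 0
for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
    weights = n_in * n_out   # one learned value per connection
    biases = n_out           # one learned value per output unit
    total_params += weights + biases

print(total_params)  # 768*3072 + 3072 + 3072*768 + 768 = 4,722,432
```

Even this toy network has millions of parameters; scaling the same arithmetic up to hundreds of layers and much wider ones is how models reach the billions and trillions.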
Discussed in Chapter 1 of This Is Server Country