LLM Configuration
Configuring the Large Language Models (LLMs) is crucial for defining how agents reason and process information. This section explains how to specify LLMs for your agents.
Why is LLM Configuration Important?
- Intelligence: LLMs empower agents with the ability to understand and generate human-like text.
- Customization: Different models offer varying capabilities.
- Performance: Selecting the appropriate model affects the agent's efficiency and response quality.
YAML Configuration for LLMs
Each agent can be configured with one or more LLMs. Here's how you can define them:
LLM_config:
  params:
    model: "<Model Name>"
    temperature: <Value>
    max_tokens: <Value>
    request_timeout: <Value>
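For instance, a filled-in configuration might look like the following. The model name and parameter values here are purely illustrative; substitute whatever model and limits your provider supports:

```yaml
LLM_config:
  params:
    model: "gpt-4"        # illustrative model name
    temperature: 0.2      # mostly deterministic output
    max_tokens: 512       # cap the response length
    request_timeout: 60   # give up after 60 seconds
```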
Explanation of Parameters
- model: Specifies the model variant to use.
- temperature: Controls the randomness of the output. A lower value (e.g., 0.0) makes output more deterministic, while a higher value (e.g., 0.7) allows for more creativity.
- max_tokens: Maximum number of tokens in the output.
- request_timeout: Time in seconds before the request times out.
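To show how such a configuration might be consumed in code, here is a minimal sketch that parses the YAML and reads out the parameters. It assumes the PyYAML package is installed, and the model name and values are illustrative, not prescribed by this document:

```python
import yaml  # PyYAML; assumed available

# Illustrative configuration mirroring the structure described above.
CONFIG_YAML = """
LLM_config:
  params:
    model: "gpt-4"
    temperature: 0.2
    max_tokens: 512
    request_timeout: 60
"""

# Parse the YAML and extract the LLM parameters.
params = yaml.safe_load(CONFIG_YAML)["LLM_config"]["params"]
print(params["model"], params["temperature"])  # prints: gpt-4 0.2
```

From here, the `params` dictionary can be passed to whatever client library actually issues the model requests.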