Providers
Understanding LLM providers and how they integrate with Agenite
What are providers?
Providers in Agenite are implementations of the `LLMProvider` interface that connect your agents to specific large language model services. They translate Agenite's standardized message format into provider-specific API calls and convert the responses back into the Agenite format.
This provider abstraction allows your agents to:
- Switch between different LLM services without changing your core agent code
- Take advantage of specialized capabilities of different providers
- Maintain a consistent development experience across different LLMs
Supported providers
Agenite includes official support for several popular LLM providers:
OpenAI
The OpenAI provider connects to OpenAI’s GPT models, including GPT-3.5 Turbo and GPT-4.
Anthropic
The Anthropic provider connects to Anthropic’s Claude models.
AWS Bedrock
The AWS Bedrock provider connects to various models available on the AWS Bedrock service, including Claude, Llama, and more.
Ollama
The Ollama provider connects to locally-hosted open-source models using Ollama.
Provider configuration
Each provider has its own configuration options, but they typically include:
- Authentication credentials: API keys or other authentication methods
- Model selection: Which specific model to use
- Generation parameters: Temperature, max tokens, etc.
- Endpoint configuration: Custom endpoints or regions
Here’s an example of the configuration options for the OpenAI provider:
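The shape below is illustrative: the option names follow the common provider settings listed above (credentials, model, generation parameters, endpoint), but the exact field names in Agenite's OpenAI provider may differ.

```typescript
// Illustrative configuration object for an OpenAI-backed provider.
// Field names here are assumptions based on common provider settings,
// not necessarily Agenite's exact configuration shape.
const openAIConfig = {
  apiKey: process.env.OPENAI_API_KEY ?? '', // authentication credentials
  model: 'gpt-4',                           // model selection
  temperature: 0.2,                         // generation parameters
  maxTokens: 1024,
  baseURL: 'https://api.openai.com/v1',     // endpoint configuration
};
```

Note that the API key is read from an environment variable rather than hardcoded, in line with the best practices below.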
Using providers with agents
Providers are passed to agents during initialization:
The agent uses the provider to handle all LLM communication, making it easy to switch providers:
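The pattern can be sketched with hypothetical stand-ins for the agent and two providers; in a real project these would come from the Agenite packages, and the actual constructor options and method names will differ.

```typescript
// Hypothetical stand-ins to illustrate the pattern; not Agenite's real API.
interface LLMProvider {
  name: string;
  generate(prompt: string): Promise<string>;
}

class Agent {
  constructor(private provider: LLMProvider) {}

  async run(prompt: string): Promise<string> {
    // All LLM communication goes through the provider.
    return this.provider.generate(prompt);
  }
}

// Two mock providers standing in for real services.
const anthropic: LLMProvider = {
  name: 'anthropic',
  generate: async (p) => `[claude] ${p}`,
};
const ollama: LLMProvider = {
  name: 'ollama',
  generate: async (p) => `[llama] ${p}`,
};

// Switching providers changes one line, not the agent logic.
const agent = new Agent(anthropic);
const localAgent = new Agent(ollama);
```

Because the agent only depends on the provider interface, moving from a hosted model to a local Ollama model is a one-line change at construction time.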
Creating custom providers
You can create custom providers by implementing the `LLMProvider` interface. This allows you to:
- Support proprietary or internal LLM services
- Add support for new public LLM services
- Create advanced wrappers around existing providers
Here’s a simplified example of implementing a custom provider:
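The sketch below is self-contained: the `BaseLLMProvider` shown here is a stand-in with assumed method signatures so the example can compile on its own, and the internal HTTP endpoint is hypothetical. Agenite's real base class and types will differ in detail.

```typescript
// Simplified stand-ins with assumed signatures, not Agenite's actual types.
interface Message { role: string; content: string; }
interface GenerateResponse { content: string; }

abstract class BaseLLMProvider {
  abstract generate(messages: Message[]): Promise<GenerateResponse>;
  abstract stream(messages: Message[]): AsyncGenerator<string>;
  // In Agenite, the base class would also supply iterate(), built on
  // top of generate/stream, so subclasses don't implement it.
}

// A custom provider for a hypothetical internal LLM HTTP service.
class MyInternalProvider extends BaseLLMProvider {
  constructor(private endpoint: string) { super(); }

  async generate(messages: Message[]): Promise<GenerateResponse> {
    const res = await fetch(this.endpoint, {
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({ messages }),
    });
    const data = await res.json();
    return { content: data.text };
  }

  async *stream(messages: Message[]): AsyncGenerator<string> {
    // Simplest possible streaming: yield the full response at once.
    const { content } = await this.generate(messages);
    yield content;
  }
}
```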
By extending `BaseLLMProvider`, you only need to implement the `generate` and `stream` methods; the `iterate` method will be handled for you.
Provider best practices
When working with providers, follow these best practices:
- Store API keys securely: Never hardcode API keys or credentials. Use environment variables or secure secret management.
- Handle rate limits: Implement appropriate retry logic and respect the rate limits of the LLM service.
- Implement timeouts: Set reasonable timeouts to handle slow or non-responsive API calls.
- Configure for your use case: Adjust provider configuration options to match your specific use case:
  - Lower temperatures (0.0-0.3) for more deterministic responses
  - Higher temperatures (0.7-1.0) for more creative responses
  - An appropriate max tokens limit based on expected response length
- Implement error handling: Always handle errors from the LLM service gracefully.
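The retry and timeout advice above can be sketched as a pair of small wrappers around any provider call. This is a generic pattern, not part of Agenite's API.

```typescript
// Wrap a promise so it rejects if it doesn't settle within `ms` milliseconds.
async function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  try {
    return await Promise.race([promise, timeout]);
  } finally {
    if (timer !== undefined) clearTimeout(timer);
  }
}

// Retry a flaky async call with exponential backoff between attempts.
async function withRetries<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Backoff doubles each attempt: 500ms, 1000ms, 2000ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

A provider call could then be wrapped as `withRetries(() => withTimeout(provider.generate(messages), 30_000))`, combining both safeguards.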
Conclusion
Providers are a key component of Agenite’s flexibility. By implementing a common interface for different LLM services, providers enable your agents to use the best model for the job without being locked into a single vendor or technology.
In the next section, we’ll explore the concept of tools in Agenite and how they extend the capabilities of your agents.