Provider APIs
Reference for LLM providers in Agenite
Overview
Agenite supports multiple LLM providers through a consistent interface. Each provider implements the `LLMProvider` interface from the `@agenite/llm` package.
Common interface
Most providers extend the `BaseLLMProvider` class, which implements the `iterate` method for you, so you only need to implement `generate` and `stream`.
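The sketch below illustrates that split. Only the `LLMProvider`, `BaseLLMProvider`, `generate`, `stream`, and `iterate` names come from the description above; the prompt and response types shown are assumptions made for illustration, not the exact signatures in `@agenite/llm`.

```typescript
// Illustrative shape only; the real types live in @agenite/llm and may differ.
interface GenerateResponse {
  content: string;                                        // assumed: final text output
  tokens: { inputTokens: number; outputTokens: number };  // assumed: usage metrics
}

interface LLMProvider {
  generate(prompt: string): Promise<GenerateResponse>;
  stream(prompt: string): AsyncGenerator<string, GenerateResponse>;
  iterate(prompt: string): AsyncGenerator<string, GenerateResponse>;
}

// BaseLLMProvider supplies iterate() for you, so a concrete provider
// only needs to implement generate() and stream().
abstract class BaseLLMProvider implements LLMProvider {
  abstract generate(prompt: string): Promise<GenerateResponse>;
  abstract stream(prompt: string): AsyncGenerator<string, GenerateResponse>;

  async *iterate(prompt: string): AsyncGenerator<string, GenerateResponse> {
    // Default iteration delegates to the provider's stream() implementation.
    return yield* this.stream(prompt);
  }
}
```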
OpenAI provider
Installation
Usage
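A minimal usage sketch, assuming the provider ships as an `@agenite/openai` package exposing an `OpenAIProvider` class whose constructor takes `apiKey` and `model` options; verify the exact names against the package before relying on them.

```typescript
// Sketch only: package, class, and option names below are assumptions.
// Assumed installation: npm install @agenite/openai
import { OpenAIProvider } from '@agenite/openai';

const provider = new OpenAIProvider({
  apiKey: process.env.OPENAI_API_KEY, // read the key from the environment
  model: 'gpt-4o-mini',               // any model your account can access
});

// One-shot generation.
const result = await provider.generate('Summarize the Agenite provider API.');
console.log(result);

// Streaming yields chunks as they arrive, which keeps latency low.
for await (const chunk of provider.stream('Say hello')) {
  console.log(chunk);
}
```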
Configuration
Anthropic provider
Installation
Usage
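A usage sketch under the same caveats as the OpenAI example: the `@agenite/anthropic` package name, the `AnthropicProvider` class, and its constructor options are assumptions.

```typescript
// Sketch only: package, class, and option names below are assumptions.
// Assumed installation: npm install @agenite/anthropic
import { AnthropicProvider } from '@agenite/anthropic';

const provider = new AnthropicProvider({
  apiKey: process.env.ANTHROPIC_API_KEY, // Anthropic API key from the environment
  model: 'claude-3-5-sonnet-20241022',   // any available Claude model
});

const result = await provider.generate('Explain the difference between generate and stream.');
console.log(result);
```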
Configuration
AWS Bedrock provider
Installation
Usage
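A usage sketch, assuming an `@agenite/bedrock` package with a `BedrockProvider` class. Bedrock typically authenticates through standard AWS credentials (environment variables, a shared profile, or an IAM role) rather than an explicit API key; the option names shown are assumptions.

```typescript
// Sketch only: package, class, and option names below are assumptions.
// Assumed installation: npm install @agenite/bedrock
import { BedrockProvider } from '@agenite/bedrock';

const provider = new BedrockProvider({
  region: 'us-east-1',                                // AWS region hosting the model
  model: 'anthropic.claude-3-5-sonnet-20241022-v2:0', // a Bedrock model identifier
});

const result = await provider.generate('List three uses for an agent framework.');
console.log(result);
```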
Configuration
Ollama provider
Installation
Usage
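A usage sketch, assuming an `@agenite/ollama` package with an `OllamaProvider` class pointed at a locally running Ollama server; the option names are assumptions.

```typescript
// Sketch only: package, class, and option names below are assumptions.
// Assumed installation: npm install @agenite/ollama
import { OllamaProvider } from '@agenite/ollama';

const provider = new OllamaProvider({
  baseURL: 'http://localhost:11434', // default local Ollama endpoint
  model: 'llama3.2',                 // any model pulled with `ollama pull`
});

// Stream the response chunk by chunk from the local model.
for await (const chunk of provider.stream('Why run models locally?')) {
  console.log(chunk);
}
```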
Configuration
Creating custom providers
You can create custom providers by extending the `BaseLLMProvider` class:
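The sketch below shows the general shape; the exact parameter and return types of `generate` and `stream` in `@agenite/llm` may differ from what is assumed here, and `callMyModel`/`streamMyModel` are hypothetical stand-ins for your own backend client.

```typescript
// Sketch only: the concrete generate/stream signatures in @agenite/llm may differ.
import { BaseLLMProvider } from '@agenite/llm';

// Hypothetical stand-ins for your own backend client.
declare function callMyModel(prompt: string): Promise<string>;
declare function streamMyModel(prompt: string): AsyncIterable<string>;

class MyCustomProvider extends BaseLLMProvider {
  async generate(prompt: string) {
    // Call the backend and map its response to Agenite's standard format.
    const text = await callMyModel(prompt);
    return {
      content: text,
      tokens: { inputTokens: 0, outputTokens: 0 }, // report real usage metrics here
    };
  }

  async *stream(prompt: string) {
    // Yield partial chunks as they arrive, then return the final response.
    let full = '';
    for await (const chunk of streamMyModel(prompt)) {
      full += chunk;
      yield chunk;
    }
    return { content: full, tokens: { inputTokens: 0, outputTokens: 0 } };
  }
}
```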
Response formats
All providers return responses in a standardized format:
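The exact response type is defined in `@agenite/llm`; the shape below is only indicative, and every field name in it is an assumption.

```typescript
// Indicative shape only; field names are assumptions, not the exact @agenite/llm type.
const response = {
  content: 'Hello! How can I help?',            // normalized model output
  tokens: { inputTokens: 12, outputTokens: 9 }, // usage metrics reported by the provider
  stopReason: 'end_turn',                       // why generation stopped
};
```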
Best practices
- Error handling: Implement robust error handling for API failures (see the retry sketch after this list)
- Rate limiting: Respect provider rate limits
- Token tracking: Always include accurate token usage metrics
- Streaming: Implement efficient streaming to minimize latency
- Content mapping: Properly map provider-specific responses to Agenite’s standard format
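One way to cover the first two points is a plain retry-with-backoff wrapper around any provider call; this is a generic pattern, not an Agenite API.

```typescript
// Generic retry-with-backoff pattern; not part of Agenite, works with any provider call.
async function generateWithRetry(
  provider: { generate(prompt: string): Promise<unknown> },
  prompt: string,
  maxAttempts = 3,
): Promise<unknown> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await provider.generate(prompt);
    } catch (err) {
      if (attempt === maxAttempts) throw err; // give up after the final attempt
      // Exponential backoff: wait 1s, 2s, 4s, ... before retrying.
      await new Promise((resolve) => setTimeout(resolve, 1000 * 2 ** (attempt - 1)));
    }
  }
  throw new Error('unreachable');
}
```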
Next steps
- Learn about the Agent API
- Explore the Tool API
- Read about middleware