# Providers
altimate supports 35+ LLM providers. Configure them in the `provider` section of your config file.
## Provider Configuration

Each provider has a key in the `provider` object:
```json
{
  "provider": {
    "<provider-name>": {
      "apiKey": "{env:API_KEY}",
      "baseURL": "https://custom.endpoint.com/v1",
      "headers": {
        "X-Custom-Header": "value"
      }
    }
  }
}
```
> **Tip:** Use `{env:...}` substitution for API keys so you never commit secrets to version control.
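For example, a key can come from an environment variable or be read from a file on disk via the `{file:...}` form listed in the provider options reference (the file path here is purely illustrative):

```json
{
  "provider": {
    "anthropic": {
      "apiKey": "{env:ANTHROPIC_API_KEY}"
    },
    "openai": {
      "apiKey": "{file:~/.secrets/openai-key}"
    }
  }
}
```

Either way, the secret itself never appears in the config file, so the file is safe to commit.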
## Anthropic
```json
{
  "provider": {
    "anthropic": {
      "apiKey": "{env:ANTHROPIC_API_KEY}"
    }
  },
  "model": "anthropic/claude-sonnet-4-6"
}
```
Available models: `claude-opus-4-6`, `claude-sonnet-4-6`, `claude-haiku-4-5-20251001`
## OpenAI
```json
{
  "provider": {
    "openai": {
      "apiKey": "{env:OPENAI_API_KEY}"
    }
  },
  "model": "openai/gpt-4o"
}
```
## AWS Bedrock
```json
{
  "provider": {
    "bedrock": {
      "region": "us-east-1",
      "accessKeyId": "{env:AWS_ACCESS_KEY_ID}",
      "secretAccessKey": "{env:AWS_SECRET_ACCESS_KEY}"
    }
  },
  "model": "bedrock/anthropic.claude-sonnet-4-6-v1"
}
```
Uses the standard AWS credential chain. Set `AWS_PROFILE` or provide credentials directly.

> **Note:** If you have AWS SSO or IAM roles configured, Bedrock will use your default credential chain automatically, so no explicit keys are needed.
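With SSO or an IAM role in place, the key fields can be dropped entirely; a minimal configuration then only pins the region:

```json
{
  "provider": {
    "bedrock": {
      "region": "us-east-1"
    }
  },
  "model": "bedrock/anthropic.claude-sonnet-4-6-v1"
}
```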
## Azure OpenAI
```json
{
  "provider": {
    "azure": {
      "apiKey": "{env:AZURE_OPENAI_API_KEY}",
      "baseURL": "https://your-resource.openai.azure.com/openai/deployments/your-deployment"
    }
  },
  "model": "azure/gpt-4o"
}
```
## Google (Gemini)
```json
{
  "provider": {
    "google": {
      "apiKey": "{env:GOOGLE_API_KEY}"
    }
  },
  "model": "google/gemini-2.5-pro"
}
```
## Google Vertex AI
```json
{
  "provider": {
    "google-vertex": {
      "project": "my-gcp-project",
      "location": "us-central1"
    }
  },
  "model": "google-vertex/gemini-2.5-pro"
}
```
Uses Google Cloud Application Default Credentials. Authenticate with:

```shell
gcloud auth application-default login
```
The `project` and `location` fields can also be set via environment variables:

| Field | Environment Variables (checked in order) |
|---|---|
| `project` | `GOOGLE_CLOUD_PROJECT`, `GCP_PROJECT`, `GCLOUD_PROJECT` |
| `location` | `GOOGLE_VERTEX_LOCATION`, `GOOGLE_CLOUD_LOCATION`, `VERTEX_LOCATION` |

If `location` is not set, it defaults to `us-central1`.
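The lookup order can be sketched as follows — the variable names and the `us-central1` default come from the table above, while the resolution logic itself is an assumption about how the tool behaves, not its actual implementation:

```python
import os

# Fallback chains from the table above; the first variable that is set wins.
PROJECT_VARS = ["GOOGLE_CLOUD_PROJECT", "GCP_PROJECT", "GCLOUD_PROJECT"]
LOCATION_VARS = ["GOOGLE_VERTEX_LOCATION", "GOOGLE_CLOUD_LOCATION", "VERTEX_LOCATION"]

def resolve(config_value, env_vars, default=None):
    """Config file value wins, then env vars in order, then the default."""
    if config_value:
        return config_value
    for name in env_vars:
        value = os.environ.get(name)
        if value:
            return value
    return default

# With no config value and no env vars set, location falls back to the default.
location = resolve(None, LOCATION_VARS, default="us-central1")
```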
> **Tip:** You can also access Anthropic models through Vertex AI using the `google-vertex` provider (e.g., `google-vertex/claude-sonnet-4-6`).
## Ollama (Local)
```json
{
  "provider": {
    "ollama": {
      "baseURL": "http://localhost:11434"
    }
  },
  "model": "ollama/llama3.1"
}
```
No API key needed. Runs entirely on your local machine.

> **Info:** Make sure Ollama is running before starting altimate. Install it from ollama.com and pull your desired model with `ollama pull llama3.1`.
## OpenRouter
```json
{
  "provider": {
    "openrouter": {
      "apiKey": "{env:OPENROUTER_API_KEY}"
    }
  },
  "model": "openrouter/anthropic/claude-sonnet-4-6"
}
```
Access 150+ models through a single API key.
## Copilot
```json
{
  "provider": {
    "copilot": {}
  },
  "model": "copilot/gpt-4o"
}
```
Uses your GitHub Copilot subscription. Authenticate with `altimate auth`.
## Custom / OpenAI-Compatible
Any OpenAI-compatible endpoint can be used as a provider:
```json
{
  "provider": {
    "my-provider": {
      "api": "openai",
      "baseURL": "https://my-llm-proxy.example.com/v1",
      "apiKey": "{env:MY_API_KEY}"
    }
  },
  "model": "my-provider/my-model"
}
```
> **Tip:** This works with any service that exposes an OpenAI-compatible chat completions API, including vLLM, LiteLLM, and self-hosted inference servers.
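As a quick sanity check that an endpoint really speaks the OpenAI chat completions protocol, you can build and send a minimal request by hand. This sketch uses only the Python standard library; the base URL, API key, and model name are placeholders matching the config above, not real values:

```python
import json
import urllib.request

# Placeholders: substitute your proxy's real endpoint, key, and model id.
BASE_URL = "https://my-llm-proxy.example.com/v1"
API_KEY = "sk-placeholder"

# Minimal OpenAI-style chat completions request body.
payload = {
    "model": "my-model",
    "messages": [{"role": "user", "content": "Say hello."}],
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)

# Uncomment to actually send; a compatible server responds with JSON
# containing a "choices" list.
# with urllib.request.urlopen(request) as response:
#     body = json.loads(response.read())
#     print(body["choices"][0]["message"]["content"])
```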
## Model Selection
Set your default model and a smaller model for lightweight tasks:
```json
{
  "model": "anthropic/claude-sonnet-4-6",
  "small_model": "anthropic/claude-haiku-4-5-20251001"
}
```
The `small_model` is used for lightweight tasks like summarization and context compaction.
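Model identifiers throughout this page follow a `provider/model` pattern, and the model part may itself contain slashes (e.g. `openrouter/anthropic/claude-sonnet-4-6`), so splitting on the first `/` is the natural way to read them. This is a sketch of that convention, not altimate's actual parser:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split 'provider/model' on the first slash only, since the model
    part may itself contain slashes (e.g. OpenRouter identifiers)."""
    provider, _, model = model_id.partition("/")
    return provider, model

# Examples using model ids from this page:
# split_model_id("anthropic/claude-sonnet-4-6")
#   -> ("anthropic", "claude-sonnet-4-6")
# split_model_id("openrouter/anthropic/claude-sonnet-4-6")
#   -> ("openrouter", "anthropic/claude-sonnet-4-6")
```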
## Provider Options Reference
| Field | Type | Description |
|---|---|---|
| `apiKey` | string | API key (supports `{env:...}` and `{file:...}`) |
| `baseURL` | string | Custom API endpoint URL |
| `api` | string | API type (e.g., `"openai"` for compatible endpoints) |
| `headers` | object | Custom HTTP headers to include with requests |
| `region` | string | AWS region (Bedrock only) |
| `accessKeyId` | string | AWS access key (Bedrock only) |
| `secretAccessKey` | string | AWS secret key (Bedrock only) |
| `project` | string | GCP project ID (Google Vertex AI only) |
| `location` | string | GCP region (Google Vertex AI only, default: `us-central1`) |