# AI Providers

Clanker supports multiple AI providers as backends for its natural language processing. This page explains how provider selection works and summarizes the available providers.
## How Provider Selection Works
Clanker determines which AI provider to use through the following priority chain:

1. CLI flag override (`--ai-profile <name>`): selects a specific provider by name. This takes the highest priority.
2. Config file default (`ai.default_provider`): the provider named in this field is used when no CLI flag is specified.
3. Built-in fallback: if neither the flag nor the config file specifies a provider, Clanker defaults to `openai`.
### Example
```bash
# Uses whatever is set as ai.default_provider in ~/.clanker.yaml
clanker ask "What EC2 instances are running?"

# Overrides to use Anthropic for this query
clanker ask --ai-profile anthropic "What EC2 instances are running?"
```

## Per-Query Key and Model Overrides
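For reference, setting the config file default might look like the following. The file path and the `ai.default_provider` key come from this page; the exact YAML nesting is an assumption inferred from the dotted key name.

```yaml
# ~/.clanker.yaml
ai:
  default_provider: anthropic
```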
In addition to selecting a provider by name, you can override the API key and model for a specific provider on a per-query basis using dedicated flags:
### Key override flags
| Flag | Provider |
|---|---|
| `--openai-key` | `openai` |
| `--anthropic-key` | `anthropic` |
| `--gemini-key` | `gemini-api` |
| `--deepseek-key` | `deepseek` |
| `--cohere-key` | `cohere` |
| `--minimax-key` | `minimax` |
### Model override flags
| Flag | Provider |
|---|---|
| `--openai-model` | `openai` |
| `--anthropic-model` | `anthropic` |
| `--gemini-model` | `gemini` / `gemini-api` |
| `--deepseek-model` | `deepseek` |
| `--cohere-model` | `cohere` |
| `--minimax-model` | `minimax` |
| `--github-model` | `github-models` |
### Example
```bash
# Use OpenAI with a specific model and key for this one query
clanker ask --ai-profile openai --openai-key "$MY_KEY" --openai-model gpt-4o "List my S3 buckets"
```

## Provider Summary
| Provider | Config Key | Default Model | Auth Method |
|---|---|---|---|
| OpenAI | `openai` | `gpt-5` | API key |
| Anthropic | `anthropic` | Auto-detected (latest) | API key |
| Gemini API | `gemini-api` | `gemini-2.5-flash` | API key (Google AI Studio) |
| Gemini (Vertex AI) | `gemini` | `gemini-2.0-flash` | Application Default Credentials |
| AWS Bedrock | `bedrock` | `us.anthropic.claude-sonnet-4-20250514-v1:0` | AWS CLI profile |
| DeepSeek | `deepseek` | `deepseek-chat` | API key |
| Cohere | `cohere` | `command-a-03-2025` | API key |
| MiniMax | `minimax` | `MiniMax-M2.5` | API key |
| GitHub Models | `github-models` | `openai/gpt-5.4` | GitHub CLI auth token |
## Choosing a Provider
Consider the following when selecting a provider:
- Cost: Providers vary significantly in pricing. DeepSeek and Gemini API tend to be more affordable for high-volume usage.
- Capability: Larger models (GPT-5, Claude Sonnet/Opus) generally produce more detailed and accurate infrastructure analysis.
- Latency: Local Bedrock endpoints may offer lower latency for AWS-heavy workloads. Gemini Flash models are optimized for speed.
- Compliance: If your organization requires that data stay within AWS, use the Bedrock provider.
- No API key needed: The `gemini` provider (Vertex AI) uses GCP Application Default Credentials, and `github-models` uses your existing `gh` auth token.