
AI Providers

Clanker supports multiple AI providers as backends for its natural language processing. This page explains how provider selection works and provides a summary of all available providers.


How Provider Selection Works

Clanker determines which AI provider to use through the following priority chain:

  1. CLI flag override (--ai-profile <name>): Selects a specific provider by name. This takes the highest priority.
  2. Config file default (ai.default_provider): The provider named in this field is used when no CLI flag is specified.
  3. Built-in fallback: If neither the flag nor the config file specifies a provider, Clanker defaults to openai.

Example

```bash
# Uses whatever is set as ai.default_provider in ~/.clanker.yaml
clanker ask "What EC2 instances are running?"

# Overrides to use Anthropic for this query
clanker ask --ai-profile anthropic "What EC2 instances are running?"
```
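
The config-file default from step 2 lives in ~/.clanker.yaml. A minimal sketch of that file, assuming the ai section contains only the default_provider field documented above (any other settings are omitted here):

```yaml
# ~/.clanker.yaml — minimal sketch; only ai.default_provider is shown
ai:
  default_provider: anthropic
```

With this in place, plain `clanker ask` queries use Anthropic unless overridden by --ai-profile.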

Per-Query Key and Model Overrides

In addition to selecting a provider by name, you can override the API key and model for a specific provider on a per-query basis using dedicated flags:

Key override flags

| Flag | Provider |
|------|----------|
| --openai-key | openai |
| --anthropic-key | anthropic |
| --gemini-key | gemini-api |
| --deepseek-key | deepseek |
| --cohere-key | cohere |
| --minimax-key | minimax |

Model override flags

| Flag | Provider |
|------|----------|
| --openai-model | openai |
| --anthropic-model | anthropic |
| --gemini-model | gemini / gemini-api |
| --deepseek-model | deepseek |
| --cohere-model | cohere |
| --minimax-model | minimax |
| --github-model | github-models |

Example

```bash
# Use OpenAI with a specific model and key for this one query
clanker ask --ai-profile openai --openai-key "$MY_KEY" --openai-model gpt-4o "List my S3 buckets"
```
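
The same pattern applies to any provider listed in the flag tables above. For example, a sketch using the Gemini API flags (GEMINI_KEY is an illustrative environment variable; the model name is the gemini-api default from the provider summary):

```bash
# Use the Gemini API with an explicit key and model for a single query
clanker ask --ai-profile gemini-api --gemini-key "$GEMINI_KEY" --gemini-model gemini-2.5-flash "Summarize my VPC layout"
```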

Provider Summary

| Provider | Config Key | Default Model | Auth Method |
|----------|------------|---------------|-------------|
| OpenAI | openai | gpt-5 | API key |
| Anthropic | anthropic | Auto-detected (latest) | API key |
| Gemini API | gemini-api | gemini-2.5-flash | API key (Google AI Studio) |
| Gemini (Vertex AI) | gemini | gemini-2.0-flash | Application Default Credentials |
| AWS Bedrock | bedrock | us.anthropic.claude-sonnet-4-20250514-v1:0 | AWS CLI profile |
| DeepSeek | deepseek | deepseek-chat | API key |
| Cohere | cohere | command-a-03-2025 | API key |
| GitHub Models | github-models | openai/gpt-5.4 | GitHub CLI auth token |
| MiniMax | minimax | MiniMax-M2.5 | API key |
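
Note that two providers in this table need no API-key flag at all: Bedrock authenticates through your AWS CLI profile, and GitHub Models reuses your existing gh auth token. A sketch of selecting each per query (the query text is illustrative):

```bash
# Bedrock authenticates via your AWS CLI profile — no key flag needed
clanker ask --ai-profile bedrock "Which security groups allow 0.0.0.0/0?"

# GitHub Models reuses your existing gh auth token
clanker ask --ai-profile github-models "List my running ECS services"
```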

Choosing a Provider

Consider the following when selecting a provider:

  • Cost: Providers vary significantly in pricing. DeepSeek and Gemini API tend to be more affordable for high-volume usage.
  • Capability: Larger models (GPT-5, Claude Sonnet/Opus) generally produce more detailed and accurate infrastructure analysis.
  • Latency: Bedrock endpoints in the same AWS region as your workloads may offer lower latency for AWS-heavy usage. Gemini Flash models are optimized for speed.
  • Compliance: If your organization requires that data stay within AWS, use the Bedrock provider.
  • No API key needed: The gemini provider (Vertex AI) uses GCP Application Default Credentials, and github-models uses your existing gh auth token.