Configuration File Reference
Clanker reads its configuration from ~/.clanker.yaml. This page documents every supported key, its type, default value, and behavior.
You can also specify a custom config file path using the --config flag:

```shell
clanker --config /path/to/custom.yaml ask "What EC2 instances are running?"
```

ai
Controls which AI provider Clanker uses for language model calls.
```yaml
ai:
  # The provider to use by default. If omitted, Clanker falls back to "openai".
  # Valid values: openai, anthropic, gemini-api, gemini, bedrock, deepseek, cohere, minimax, github-models
  default_provider: gemini-api # string

  # Maximum prompt size in characters before Clanker triggers automatic summarization.
  # Default: 120000
  max_prompt_chars: 120000 # int

  # Size in characters of each chunk when splitting large contexts for summarization.
  # Default: 120000
  chunk_chars: 120000 # int

  # Maximum number of context chunks to process during summarization.
  # Default: 6
  max_chunks: 6 # int
```
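The three summarization knobs above work together: once a prompt exceeds max_prompt_chars, the context is split into chunk_chars-sized pieces and at most max_chunks of them are processed. A minimal sketch of that splitting, assuming (this is an illustration, not Clanker's code) that chunks beyond the cap are simply dropped:

```python
def split_for_summarization(text: str, chunk_chars: int = 120_000, max_chunks: int = 6) -> list[str]:
    """Split an oversized context into fixed-size character chunks,
    keeping at most max_chunks of them. Illustrative only; the real
    splitter may handle overflow and boundaries differently."""
    chunks = [text[i:i + chunk_chars] for i in range(0, len(text), chunk_chars)]
    return chunks[:max_chunks]
```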
```yaml
ai:
  providers:
    # Each key under providers is a named provider profile.
    # You can reference it with --ai-profile <name> or set it as default_provider.
    openai:
      model: gpt-5    # string - Model name to use
      api_key: ""     # string - API key (stored directly in config)
      api_key_env: "" # string - Name of environment variable containing the key
    anthropic:
      model: ""       # string - Model name (auto-detected if empty; uses latest)
      api_key: ""     # string - API key
      api_key_env: "" # string - Name of environment variable containing the key
    gemini-api:
      model: gemini-2.5-flash     # string - Model name for Google AI Studio
      api_key: ""                 # string - API key from Google AI Studio
      api_key_env: GEMINI_API_KEY # string - Name of environment variable containing the key
    gemini:
      model: gemini-2.0-flash # string - Model name for Vertex AI / Application Default Credentials
      # No API key needed; uses gcloud application-default credentials
    bedrock:
      aws_profile: ""   # string - AWS CLI profile with Bedrock access
      region: us-east-1 # string - AWS region for the Bedrock endpoint
      model: us.anthropic.claude-sonnet-4-20250514-v1:0 # string - Bedrock model ID
    deepseek:
      model: deepseek-chat          # string - Model name (deepseek-chat or deepseek-reasoner)
      api_key: ""                   # string - API key
      api_key_env: DEEPSEEK_API_KEY # string - Environment variable name
    cohere:
      model: command-a-03-2025    # string - Model name
      api_key: ""                 # string - API key
      api_key_env: COHERE_API_KEY # string - Environment variable name
    minimax:
      model: MiniMax-M2.5          # string - Model name
      api_key: ""                  # string - API key
      api_key_env: MINIMAX_API_KEY # string - Environment variable name
    github-models:
      model: openai/gpt-5.4 # string - Model name on GitHub Models
      # No API key needed; uses the gh auth token
```

Provider key resolution order
For each provider, Clanker resolves the API key in the following order:
1. CLI flag (e.g., --openai-key, --gemini-key)
2. The api_key field in the provider config
3. The environment variable named in the api_key_env field
4. A well-known environment variable (e.g., OPENAI_API_KEY, GEMINI_API_KEY)
infra
Defines cloud infrastructure targets and credentials.
```yaml
infra:
  # Which cloud provider to use by default.
  # Valid values: aws, gcp, azure
  default_provider: aws # string

  # Which named environment to use by default when querying AWS.
  default_environment: dev # string

  aws:
    environments:
      # Each key is a named environment. You can have as many as needed.
      dev:
        profile: your-dev-profile # string - AWS CLI profile name
        region: us-east-1         # string - AWS region
      stage:
        profile: your-stage-profile # string
        region: us-east-1           # string
      prod:
        profile: your-prod-profile # string
        region: us-west-2          # string
  gcp:
    project_id: your-gcp-project-id # string - GCP project ID
  azure:
    subscription_id: "" # string - Azure subscription ID
```

github
Configuration for GitHub integration, used by clanker ask --github queries.
```yaml
github:
  token: "ghp_..." # string - GitHub personal access token (optional for public repos)
  owner: your-org  # string - Repository owner or organization
  repo: your-repo  # string - Repository name
```

postgres
Configuration for PostgreSQL connections, used by clanker postgres commands.
```yaml
postgres:
  default_connection: dev # string - Name of the default connection
  connections:
    dev:
      host: localhost    # string - Database hostname
      port: 5432         # int - Database port
      database: your_db  # string - Database name
      username: postgres # string - Database username
    production:
      host: prod-db.example.com # string
      port: 5432                # int
      database: prod_db         # string
      username: app_user        # string
```

terraform
Configuration for Terraform workspace integration, used by clanker terraform and clanker ask --terraform.
```yaml
terraform:
  default_workspace: dev # string - Name of the default workspace
  workspaces:
    dev:
      path: /path/to/infra # string - Filesystem path to the Terraform root module
    production:
      path: /path/to/prod-infra # string
```

kubernetes
Configuration for Kubernetes cluster access, used by clanker k8s ask.
```yaml
kubernetes:
  kubeconfig: ""        # string - Path to kubeconfig file (default: ~/.kube/config)
  default_context: ""   # string - Default kubectl context to use
  default_namespace: "" # string - Default namespace (empty means all namespaces)
  clusters:
    production:
      type: eks          # string - "eks" or "existing"
      name: prod-cluster # string - EKS cluster name (required for eks type)
      profile: prod-aws  # string - AWS profile for EKS authentication
      region: us-east-1  # string - AWS region for the EKS cluster
    staging:
      type: eks             # string
      name: staging-cluster # string
      profile: dev-aws      # string
      region: us-west-2     # string
    local:
      type: existing    # string - Use an existing kubectl context
      context: minikube # string - Name of the kubectl context
```

cloudflare
Configuration for Cloudflare, used by clanker cf ask and clanker ask --cloudflare.
```yaml
cloudflare:
  api_token: ""    # string - Cloudflare API token (or set CLOUDFLARE_API_TOKEN / CF_API_TOKEN)
  account_id: ""   # string - Cloudflare account ID (or set CLOUDFLARE_ACCOUNT_ID / CF_ACCOUNT_ID)
  default_zone: "" # string - Default zone name for DNS operations
```

digitalocean
Configuration for DigitalOcean, used by clanker ask --digitalocean.
```yaml
digitalocean:
  api_token: "" # string - DigitalOcean API token (or set DO_API_TOKEN / DIGITALOCEAN_ACCESS_TOKEN)
```

hetzner
Configuration for Hetzner Cloud, used by clanker ask --hetzner.
```yaml
hetzner:
  api_token: "" # string - Hetzner Cloud API token (or set HCLOUD_TOKEN)
```

hermes
Configuration for the Hermes Agent, used by clanker talk and clanker ask --agent hermes.
```yaml
hermes:
  path: "./vendor/hermes-agent"            # string - Path to Hermes binary (auto-detected if omitted)
  model: "anthropic/claude-opus-4"         # string - Model to use via OpenRouter
  base_url: "https://openrouter.ai/api/v1" # string - Base URL for the model provider
  openrouter_api_key: ""                   # string - OpenRouter API key (or set OPENROUTER_API_KEY)
```

backend
Integration with the Clanker backend for storing and retrieving credentials across machines.
```yaml
backend:
  api_key: ""    # string - Backend API key (or set CLANKER_BACKEND_API_KEY)
  env: "testing" # string - Backend environment: testing, staging, or production (or set CLANKER_BACKEND_ENV)
  url: ""        # string - Custom backend URL, overrides env if set (or set CLANKER_BACKEND_URL)
```

aws (service keywords)
Optional service keyword mappings used by the internal routing engine. This is a top-level aws key, separate from infra.aws.
```yaml
aws:
  service_keywords:
    api: [api, gateway]
    serverless: [lambda, step-functions]
```

timeout
Global timeout for operations, in seconds.
```yaml
timeout: 30 # int - Timeout in seconds
```

Full Annotated Example
Below is a complete ~/.clanker.yaml showing all sections together:
```yaml
ai:
  default_provider: gemini-api
  providers:
    gemini-api:
      model: gemini-2.5-flash
      api_key_env: GEMINI_API_KEY
    openai:
      model: gpt-5
      api_key: ""
    anthropic:
      model: ""
      api_key_env: ANTHROPIC_API_KEY
    bedrock:
      aws_profile: bedrock-profile
      region: us-east-1
      model: us.anthropic.claude-sonnet-4-20250514-v1:0
    deepseek:
      model: deepseek-chat
      api_key_env: DEEPSEEK_API_KEY
    cohere:
      model: command-a-03-2025
      api_key_env: COHERE_API_KEY
    minimax:
      model: MiniMax-M2.5
      api_key_env: MINIMAX_API_KEY
    github-models:
      model: openai/gpt-5.4

infra:
  default_provider: aws
  default_environment: dev
  aws:
    environments:
      dev:
        profile: my-dev-profile
        region: us-east-1
      prod:
        profile: my-prod-profile
        region: us-west-2
  gcp:
    project_id: my-gcp-project
  azure:
    subscription_id: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"

github:
  token: "ghp_xxxxxxxxxxxxxxxxxxxx"
  owner: my-org
  repo: my-repo

postgres:
  default_connection: dev
  connections:
    dev:
      host: localhost
      port: 5432
      database: mydb
      username: postgres

terraform:
  default_workspace: dev
  workspaces:
    dev:
      path: /home/user/infra/dev

kubernetes:
  kubeconfig: ""
  default_context: ""
  default_namespace: default
  clusters:
    production:
      type: eks
      name: prod-cluster
      profile: prod-aws
      region: us-east-1

cloudflare:
  api_token: ""
  account_id: ""
  default_zone: example.com

digitalocean:
  api_token: ""

hetzner:
  api_token: ""

hermes:
  path: "./vendor/hermes-agent"
  model: "anthropic/claude-opus-4"
  base_url: "https://openrouter.ai/api/v1"
  openrouter_api_key: ""

backend:
  api_key: ""
  env: "testing"

timeout: 30
```