# Quick Start

This guide walks through common Clanker workflows with practical examples. Before proceeding, make sure you have installed Clanker and created a configuration file.

## Basic AWS Query

The most common usage pattern is asking a natural language question about your AWS infrastructure:

```bash
clanker ask "list my ec2 instances" --aws
```

Clanker will determine which AWS API calls are needed, execute them using your configured AWS profile, and return a readable summary of the results.

If your question clearly refers to AWS (for example, it mentions EC2, Lambda, or S3), you can omit the --aws flag entirely and let Clanker infer the provider:

```bash
clanker ask "what lambda functions have the most errors?"
```

## Using a Specific AWS Profile

Override the default AWS profile with the --profile flag:

```bash
clanker ask "show lambda errors in the last 24 hours" --aws --profile production
```

This is useful when you work with multiple AWS accounts (development, staging, production) and want to query a specific one.

## Multi-Cloud Queries

You can include context from multiple providers in a single query:

```bash
clanker ask "show my infrastructure" --aws --gcp
```

This gathers data from both AWS and GCP and presents a combined view.

For Azure:

```bash
clanker ask "list my virtual machines" --azure
```

For Cloudflare:

```bash
clanker ask "show my DNS zones" --cloudflare
```

For DigitalOcean:

```bash
clanker ask "list my droplets" --digitalocean
```

For Hetzner:

```bash
clanker ask "show my cloud servers" --hetzner
```

For Verda (GPU/AI):

```bash
clanker ask "what GPU instances are running?" --verda
clanker ask "how much am I spending on Verda this month?" --verda
```

## GitHub Context

Query your GitHub repositories for pull requests, issues, workflow runs, and more:

```bash
clanker ask "show recent pull requests" --github
clanker ask "what GitHub Actions workflows failed this week?" --github
```

Make sure your ~/.clanker.yaml has a valid GitHub token configured if you are querying private repositories.

## MCP Quick Start

If you want to use Clanker from an MCP-compatible client, start the local MCP server:

```bash
clanker mcp --transport http --listen 127.0.0.1:39393
```

Then initialize the session and inspect the available tools:

```bash
curl -sS -X POST http://127.0.0.1:39393/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  --data '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"local-cli","version":"1.0"}}}'
```

```bash
curl -sS -X POST http://127.0.0.1:39393/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  --data '{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}'
```
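The tools/list call returns a JSON-RPC result containing a tools array, per the MCP specification. If jq is not installed, a plain grep/sed pipeline is enough to extract the tool names. The response below is a hand-written sample shaped like an MCP reply, not real Clanker output — clanker_ask is a hypothetical tool name used only for illustration; only clanker_route_question appears elsewhere in this guide:

```bash
# Sample tools/list response (hand-written; "clanker_ask" is hypothetical).
response='{"jsonrpc":"2.0","id":2,"result":{"tools":[{"name":"clanker_route_question"},{"name":"clanker_ask"}]}}'

# Pull out each tool name without requiring jq.
echo "$response" | grep -o '"name":"[^"]*"' | sed 's/"name":"\(.*\)"/\1/'
```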

You can also use MCP to ask Clanker how it would route a question before executing anything:

```bash
curl -sS -X POST http://127.0.0.1:39393/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  --data '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"clanker_route_question","arguments":{"question":"use clanker cloud mcp to show my saved settings"}}}'
```

For the full command reference, see the mcp command documentation.

## Terraform Queries

Inspect your Terraform state and workspaces:

```bash
clanker ask "what resources are in my terraform state?" --terraform
clanker ask "show me the terraform plan" --terraform --workspace staging
```

Clanker can also execute Terraform commands directly when the query clearly requests it (for example, "run terraform init" or "run terraform plan").

## Kubernetes Management

Clanker provides a dedicated k8s subcommand for Kubernetes operations.

### Querying Clusters with Natural Language

```bash
clanker k8s ask "show pod status in all namespaces"
clanker k8s ask "which pods are using the most memory?"
clanker k8s ask "why is my pod crashing?" --cluster my-cluster --profile myaws
```

The k8s ask command uses the same three-stage pipeline as clanker ask, but specialized for Kubernetes: it translates your question into kubectl operations, executes them, and synthesizes a response. Conversation history is maintained per cluster for follow-up questions.

### Creating Clusters

Create an EKS cluster:

```bash
clanker k8s create eks my-cluster --nodes 2 --node-type t3.small
```

Create a GKE cluster:

```bash
clanker k8s create gke my-cluster --gcp-project my-project --nodes 2 --node-type e2-standard-2
```

Create a kubeadm cluster on EC2:

```bash
clanker k8s create kubeadm my-cluster --workers 2 --node-type t3.small
```

Add --plan to any create command to preview the execution plan without applying it.

### Deploying Applications

```bash
clanker k8s deploy nginx --name my-nginx --port 80 --replicas 3
```

### Viewing Logs and Metrics

```bash
clanker k8s logs my-pod --tail 100 --since 1h
clanker k8s stats nodes
clanker k8s stats pods --all-namespaces --sort-by memory
clanker k8s stats cluster
```

## Cost Analysis

View and analyze cloud infrastructure costs across all configured providers:

```bash
clanker cost summary
```

Filter by provider:

```bash
clanker cost summary --provider aws
```

View service-level cost breakdowns:

```bash
clanker cost detail --provider aws
```

See daily cost trends over a specific period:

```bash
clanker cost trend --start 2025-01-01 --end 2025-01-31
```

Get a cost forecast:

```bash
clanker cost forecast
```

Detect cost anomalies:

```bash
clanker cost anomalies
```

View costs grouped by tags:

```bash
clanker cost tags --key Environment
```

Export cost data to a file:

```bash
clanker cost export --output costs.csv --provider aws
```
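Once exported, the CSV can be post-processed with standard Unix tools. The column layout below is a hypothetical sketch for illustration only — the actual columns Clanker writes may differ, so check a real export first:

```bash
# Hypothetical costs.csv layout -- the real export columns may differ.
cat > costs.csv <<'EOF'
service,region,cost_usd
EC2,us-east-1,120.50
S3,us-east-1,14.25
Lambda,eu-west-1,3.10
EOF

# Sum the cost column with awk, skipping the header row.
awk -F, 'NR > 1 { total += $3 } END { printf "total: %.2f\n", total }' costs.csv
# prints: total: 137.85
```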

## Infrastructure Deployment

Deploy a GitHub repository to the cloud with a single command:

```bash
clanker deploy https://github.com/user/repo
```

This clones the repository, analyzes the codebase, determines the best deployment strategy, and generates a JSON execution plan. To apply the plan immediately:

```bash
clanker deploy https://github.com/user/repo --apply
```

### Targeting Specific Providers and Deployment Methods

Deploy to AWS EC2:

```bash
clanker deploy https://github.com/user/repo --provider aws --target ec2
```

Deploy to AWS Fargate (the default for AWS):

```bash
clanker deploy https://github.com/user/repo --provider aws --target fargate
```

Deploy to GCP:

```bash
clanker deploy https://github.com/user/repo --provider gcp --gcp-project my-project --apply
```

Deploy to Azure:

```bash
clanker deploy https://github.com/user/repo --provider azure --azure-subscription my-sub-id --apply
```

Deploy to Cloudflare:

```bash
clanker deploy https://github.com/user/repo --provider cloudflare --apply
```

Deploy to DigitalOcean:

```bash
clanker deploy https://github.com/user/repo --provider digitalocean --do-token your-token --apply
```

Deploy to Hetzner:

```bash
clanker deploy https://github.com/user/repo --provider hetzner --hetzner-token your-token --apply
```

## Infrastructure Modification with Maker

Generate an infrastructure change plan without applying it:

```bash
clanker ask "create an s3 bucket called my-data-bucket with versioning enabled" --maker
```

This outputs a JSON plan to stdout. To apply a previously generated plan:

```bash
clanker ask --apply --plan-file my-plan.json
```

Or pipe directly:

```bash
clanker ask "create an RDS postgres instance" --maker | clanker ask --apply
```
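Because a saved plan is plain JSON, you can sanity-check it before handing it to --apply. The plan contents below are a hypothetical sketch (the real schema Clanker emits may differ); the validation step itself only assumes python3 is on your PATH:

```bash
# Hypothetical saved plan -- the real schema Clanker emits may differ.
cat > my-plan.json <<'EOF'
{"provider": "aws", "steps": [{"action": "create_bucket", "name": "my-data-bucket"}]}
EOF

# Fail fast on malformed JSON before applying the plan.
python3 -m json.tool my-plan.json > /dev/null && echo "plan is valid JSON"
```

If the file is not valid JSON, python3 -m json.tool exits non-zero and the apply step can be skipped.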

Use --destroyer to allow destructive operations (delete, terminate) in maker plans:

```bash
clanker ask "delete the test-vpc and all associated resources" --maker --destroyer
```

Maker plans work across providers:

```bash
clanker ask "create a cloudflare worker" --maker --cloudflare
clanker ask "create a digitalocean droplet" --maker --digitalocean
clanker ask "spin up one H100 in FIN-01 with my default ssh key" --maker --verda
```

## Interactive Mode

Start a multi-turn conversation session:

```bash
clanker talk
```

This launches the Hermes agent in interactive mode. The session maintains context across messages, so you can ask follow-up questions naturally:

```
you> show me my ec2 instances
hermes> [response with instance list]

you> which one has the highest CPU usage?
hermes> [response referencing the previous list]

you> exit
Goodbye.
```

## Choosing an AI Provider

Override the default AI provider for a single command:

```bash
clanker ask "list my s3 buckets" --aws --ai-profile anthropic
```

Override the model within a provider:

```bash
clanker ask "show ec2 instances" --aws --openai-model gpt-5
```

## IAM Security Analysis

Analyze IAM roles, policies, and permissions:

```bash
clanker ask "analyze my IAM roles for overpermissive access" --iam
```

Scope analysis to a specific role or policy:

```bash
clanker ask "what permissions does this role have?" --iam --role-arn arn:aws:iam::123456789012:role/my-role
```

## Compliance Reporting

Generate comprehensive compliance documentation:

```bash
clanker ask "generate compliance report" --compliance
```

## Infrastructure Discovery

Run a comprehensive scan of all active AWS services:

```bash
clanker ask "what is running in my infrastructure?" --discovery
```

Discovery mode automatically enables both AWS and Terraform contexts and scans all available services for active resources.

## Debug Mode

Add --debug to any command to see diagnostic output, including which operations are being executed and how the AI is processing your question:

```bash
clanker ask "show my lambda functions" --aws --debug
```

For detailed agent lifecycle logs during complex investigations:

```bash
clanker ask "why are my lambda functions timing out?" --aws --agent-trace
```