Hermes Agent

Hermes is the default conversational agent in Clanker. It is built on the NousResearch/hermes-agent project and provides multi-turn conversation with tool-use capabilities through a Python bridge process.

Prerequisites

Install the Hermes agent from the Clanker repository root:

bash
make setup-hermes

This clones the hermes-agent repository into vendor/hermes-agent/, creates a Python virtual environment, and installs all dependencies. Requirements:

  • Python 3.11 or later
  • uv package manager (curl -LsSf https://astral.sh/uv/install.sh | sh)

Usage

Single-Shot Queries

bash
clanker ask --agent hermes "what EC2 instances are running?"

Interactive Conversations

bash
clanker talk --agent hermes

Since Hermes is the default agent for talk, you can also just run:

bash
clanker talk

Debug Mode

bash
clanker ask --agent hermes --debug "analyze my infrastructure"

Debug output shows tool calls ([tool: ...]) and reasoning steps ([thinking: ...]) alongside the conversation.

Configuration

Add the Hermes section to your ~/.clanker.yaml:

yaml
hermes:
  path: "/path/to/vendor/hermes-agent"   # auto-detected if omitted
  model: "gpt-4o"                         # model name
  base_url: "https://api.openai.com/v1"   # API endpoint

AI Provider

Hermes uses the AI provider configured in your ~/.clanker.yaml under ai.default_provider. The bridge process receives the appropriate API key and base URL as environment variables.

Supported providers:

| Provider | Config Value | Notes |
|---|---|---|
| OpenAI | `openai` | Uses `OPENAI_API_KEY` |
| Anthropic | `anthropic` | Uses `ANTHROPIC_API_KEY` |
| AWS Bedrock | `bedrock` | Uses AWS profile credentials |
| OpenRouter | (any) | Set `hermes.base_url` to `https://openrouter.ai/api/v1` and provide `hermes.openrouter_api_key` |

Model Selection

The model is determined by the hermes.model config value. Common choices:

yaml
# OpenAI models
hermes:
  model: "gpt-4o"
  base_url: "https://api.openai.com/v1"

# Via OpenRouter (access to 200+ models)
hermes:
  model: "anthropic/claude-sonnet-4"
  base_url: "https://openrouter.ai/api/v1"
  openrouter_api_key: "sk-or-..."

How It Works

When you use --agent hermes, Clanker:

  1. Locates the hermes-agent directory (config, cwd, or executable-relative)
  2. Spawns the Python bridge process using the hermes venv's Python interpreter
  3. Sends an initialize handshake via JSON-RPC 2.0 over stdin/stdout
  4. Forwards your prompt with optional AWS infrastructure context
  5. Streams events back: text chunks (message_delta), tool calls (tool_call), reasoning (thought), and the final result

For talk mode, the bridge stays alive across multiple prompts, maintaining conversation history internally.
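The steps above can be sketched with plain JSON-RPC 2.0 framing. The method name `initialize` and the event names (`message_delta`, `tool_call`, `thought`) come from the list above; the exact payload shapes and the `prompt` method name are assumptions for illustration:

```python
import json

def jsonrpc_request(method: str, params: dict, req_id: int) -> str:
    """Serialize one JSON-RPC 2.0 request as a single line for the bridge's stdin."""
    return json.dumps(
        {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    )

def parse_event(line: str) -> tuple[str, dict]:
    """Parse one streamed event line from the bridge's stdout (shape assumed)."""
    msg = json.loads(line)
    return msg.get("method", ""), msg.get("params", {})

# Step 3: the initialize handshake, then step 4: forwarding a prompt.
init = jsonrpc_request("initialize", {"protocol_version": 1}, req_id=1)
prompt = jsonrpc_request("prompt", {"text": "what EC2 instances are running?"}, req_id=2)

# Step 5: a streamed text-chunk event would round-trip like this.
event_line = json.dumps(
    {"jsonrpc": "2.0", "method": "message_delta", "params": {"text": "Hi"}}
)
method, params = parse_event(event_line)
```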

Troubleshooting

"hermes-agent not found"

Run the setup from the clanker repository root:

bash
make setup-hermes

"max_tokens" error with newer models

Some newer OpenAI models (the gpt-5.x family) require max_completion_tokens instead of max_tokens, but the hermes bridge currently sends max_tokens. Use gpt-4o or another older model until the bridge is updated.
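If you control the request construction yourself (for example in a fork of the bridge), one workaround is to pick the parameter name per model. This is a sketch, not part of the hermes bridge, and the gpt-5 prefix check is an assumption based on the error described above:

```python
def token_param(model: str, limit: int) -> dict:
    """Return the token-limit parameter appropriate for the model.

    Newer OpenAI models reject max_tokens and expect max_completion_tokens;
    the prefix check below is an illustrative heuristic, not an official rule.
    """
    if model.startswith("gpt-5"):
        return {"max_completion_tokens": limit}
    return {"max_tokens": limit}
```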

Authentication errors with OpenRouter

Make sure hermes.openrouter_api_key is set in your config or the OPENROUTER_API_KEY environment variable is exported.

See Also