# AI Assistant
The TUI includes a built-in AI assistant with two modes: chat and code.
## Chat Mode

Command: `:chat`
General-purpose AI assistant for exploring and understanding your cloud infrastructure. The AI can query your resources, explain configurations, and answer questions.
## Code Mode

Command: `:dev`
Development-focused AI assistant with access to file system, git, shell, and Kubernetes tools. Use this for infrastructure-as-code workflows, debugging, and automation.
## Configuration

### API Key

Set your AI provider API key:
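This page does not name the variable bnerd reads. Given the Anthropic model defaults shown below (`haiku`, `sonnet`, `opus`), a typical setup might look like this sketch (the variable name is an assumption; verify against your bnerd version):

```shell
# Assumed variable name, not confirmed by this page; check bnerd's help output.
export ANTHROPIC_API_KEY="sk-ant-..."
```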
### Model Selection

| Flag / Config | Default | Options |
|---|---|---|
| `--ai-model` / `ai-model` | `haiku` | `haiku`, `sonnet`, `opus`, or a full model ID |
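For example, to pin sessions to the mid-sized model via the config file, using the `ai-model` key from the table:

```yaml
ai-model: sonnet
```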
### Safety Modes

| Mode | Description |
|---|---|
| `read-only` (default) | AI can only read resources; no modifications |
| `non-destructive` | AI can read and create/update, but not delete |
| `full` | AI has full access to all operations |
Set via the `--ai-mode` flag or `ai-mode` in the config file.
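For example, to let the AI create and update resources while still blocking deletions:

```yaml
ai-mode: non-destructive
```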
### OpenAI-Compatible Backends

bnerd supports OpenAI-compatible backends (e.g., Ollama, vLLM):

```yaml
ai-provider: openai
openai-base-url: http://localhost:11434
openai-model: llama3.2
openai-key: ""  # may be optional for local backends
```
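"OpenAI-compatible" here means the backend serves the standard OpenAI REST routes under `/v1`. As a minimal sketch (not bnerd's actual code) of how a base URL like the one above composes with the chat endpoint:

```python
def chat_completions_url(base_url: str) -> str:
    # OpenAI-compatible servers such as Ollama and vLLM expose the
    # standard chat endpoint under /v1; normalize any trailing slash.
    return base_url.rstrip("/") + "/v1/chat/completions"

print(chat_completions_url("http://localhost:11434"))
# → http://localhost:11434/v1/chat/completions
```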
### Token Budget
Limit tokens per AI session:
Set to 0 (default) for unlimited.
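A config sketch, assuming a key named `ai-token-budget` (hypothetical; the actual flag/key name is not shown on this page):

```yaml
# Hypothetical key name; check bnerd's configuration reference.
ai-token-budget: 50000
```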
### Shell Access
By default, the AI can only run commands from an allowlist. To enable arbitrary shell access:
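Using the flag named in the warning below, an invocation sketch:

```shell
bnerd --ai-shell-unrestricted
```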
> **Warning**
>
> Use `--ai-shell-unrestricted` only in trusted environments. It allows the AI to execute any shell command.
## Session Persistence
AI sessions are automatically saved when you exit the TUI. To restore your last session on startup:
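A config sketch, assuming a boolean key such as `ai-restore-session` (hypothetical; the actual flag/key name is not shown on this page):

```yaml
# Hypothetical key name; check bnerd's configuration reference.
ai-restore-session: true
```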
Chat and code modes maintain separate sessions.
## Clearing History

In either AI mode, use `:clear` to reset the conversation history.
## Tool Calls
The AI assistant uses tools to interact with your infrastructure. Tool calls are logged and visible in the chat view, showing:
- Tool name
- Parameters
- Status (running, success, error)
- Output (expandable)
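The fields above can be pictured as a small record per call. This Python sketch is illustrative only (the tool name and types are hypothetical, not bnerd's internals):

```python
from dataclasses import dataclass

@dataclass
class ToolCall:
    name: str                 # e.g. a hypothetical "get_pods" tool
    parameters: dict          # arguments passed to the tool
    status: str = "running"   # running | success | error
    output: str = ""          # shown expandable in the chat view

call = ToolCall("get_pods", {"namespace": "default"})
call.status, call.output = "success", "3 pods running"
print(call.status)  # → success
```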
The available tools depend on the safety mode — see MCP Server > Available Tools for the full catalog.