
# `mandu ai chat`

Interactive REPL with streaming responses and slash commands. Works offline with the `local` echo provider, or point it at Claude, OpenAI, Gemini, or a local Ollama / LM-Studio server.

Since v0.25.

```shell
# No API key — offline echo
mandu ai chat --provider=local

# Claude
export MANDU_CLAUDE_API_KEY="sk-ant-..."
mandu ai chat --provider=claude

# OpenAI with a specific model
export MANDU_OPENAI_API_KEY="sk-..."
mandu ai chat --provider=openai --model=gpt-4o

# Load a project-local system prompt
mandu ai chat --preset=mandu-conventions
```

## Slash commands

| Command | Behavior |
|---|---|
| `/help` | Print the command list (safe offline) |
| `/reset` | Clear conversation history (system prompt kept) |
| `/save <path>` | Dump history to JSON (schema v1) |
| `/load <path>` | Restore history from JSON |
| `/preset <name>` | Load `docs/prompts/<name>.md` as the system prompt |
| `/system <path>` | Load an arbitrary file as the system prompt |
| `/provider <name>` | Switch provider (`claude` / `openai` / `gemini` / `local`) |
| `/model <id>` | Override model for the current provider |
| `/quit` (aliases: `/exit`, `/bye`) | Graceful exit |
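For illustration, a REPL line can be split into a slash command plus its argument along the lines below; the command names mirror the table, but this parser is a sketch, not Mandu's implementation.

```typescript
// Hypothetical parser: "/save /tmp/h.json" → { cmd: "/save", arg: "/tmp/h.json" }.
// Lines that don't start with "/" are plain chat input.
function parseSlash(line: string): { cmd: string; arg?: string } | null {
  const trimmed = line.trim();
  if (!trimmed.startsWith("/")) return null; // not a command
  const [cmd, ...rest] = trimmed.split(/\s+/);
  return { cmd, arg: rest.length > 0 ? rest.join(" ") : undefined };
}
```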

## History JSON schema (v1)

`/save` emits a self-describing file you can diff, share, or reload later:

```json
{
  "version": 1,
  "provider": "claude|openai|gemini|local",
  "model": "optional-model-id",
  "system": "optional system prompt text",
  "savedAt": "2026-04-19T10:22:13.000Z",
  "messages": [
    { "role": "user",      "content": "hello" },
    { "role": "assistant", "content": "hi back" }
  ]
}
```

The CLI rejects files that don't match this shape with `CLI_E302`. The in-memory scrollback is bounded to 100 turns; older entries are dropped when the limit is exceeded.
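A minimal sketch of the two behaviors above — shape validation on `/load` and the 100-turn bound. The type and function names here are illustrative, not Mandu's internals:

```typescript
// Sketch of the v1 history shape and the checks a loader might perform.
type Role = "user" | "assistant";
interface HistoryFileV1 {
  version: 1;
  provider: "claude" | "openai" | "gemini" | "local";
  model?: string;
  system?: string;
  savedAt: string;
  messages: { role: Role; content: string }[];
}

const SCROLLBACK_LIMIT = 100; // turns kept in memory

function validateHistory(raw: unknown): HistoryFileV1 {
  const h = raw as Partial<HistoryFileV1>;
  if (
    h?.version !== 1 ||
    !["claude", "openai", "gemini", "local"].includes(h.provider as string) ||
    !Array.isArray(h.messages) ||
    !h.messages.every(
      (m) => (m.role === "user" || m.role === "assistant") && typeof m.content === "string",
    )
  ) {
    throw new Error("CLI_E302: history file does not match schema v1");
  }
  return h as HistoryFileV1;
}

// Oldest turns drop first once the limit is exceeded.
function boundScrollback<T>(messages: T[], limit = SCROLLBACK_LIMIT): T[] {
  return messages.length > limit ? messages.slice(messages.length - limit) : messages;
}
```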

## Flags

| Flag | Default | Notes |
|---|---|---|
| `--provider` | `local` | One of: `claude`, `openai`, `gemini`, `local` |
| `--model` | provider-specific | See `PROVIDER_DEFAULT_MODEL` in `ai-client.ts` |
| `--system` | — | Absolute or CWD-relative file path |
| `--preset` | — | Name of a `docs/prompts/<name>.md` file |
| `--timeout` | 60000 ms | Per-stream wall-clock budget |
| `--help` | off | Prints help without touching the network |

## Environment overrides

| Variable | Consumer | Purpose |
|---|---|---|
| `MANDU_AI_TIMEOUT_MS` | every provider | Raises the default stream budget (CI may want `180_000`) |
| `MANDU_LOCAL_BASE_URL` | `local` provider | Hit an OpenAI-compatible server (Ollama, LM-Studio) |
| `MANDU_CLAUDE_API_KEY` | `claude` | Anthropic key |
| `MANDU_OPENAI_API_KEY` | `openai` | OpenAI key |
| `MANDU_GEMINI_API_KEY` | `gemini` | Google key |
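One way the stream budget could resolve, assuming the explicit `--timeout` flag wins over `MANDU_AI_TIMEOUT_MS`, which in turn raises the 60 000 ms default — the precedence order is an assumption here, not documented behavior:

```typescript
// Hypothetical resolution: flag > env var > built-in default.
const DEFAULT_TIMEOUT_MS = 60_000;

function resolveTimeout(flag?: number, env?: string): number {
  if (flag !== undefined) return flag; // explicit --timeout wins
  const parsed = env !== undefined ? Number(env) : NaN;
  return Number.isFinite(parsed) && parsed > 0 ? parsed : DEFAULT_TIMEOUT_MS;
}
```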

## Offline echo provider

The `local` provider ships a deterministic echo responder that works with no API key:

```text
mandu ai chat --provider=local
> hello
< echo: hello
>
```

The echo response is a stable transform of the input — identical prompts produce identical outputs. Use it for:

  • CI smoke tests — diff-assert the output without flake.
  • Demos — no credentials required.
  • Offline flights — keep the chat surface exercised when disconnected.
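The CI use case relies only on the transform being stable. A minimal stand-in, matching the transcript above ("hello" → "echo: hello") — the real responder lives inside Mandu, this just shows why determinism makes diff-asserts flake-free:

```typescript
// Stand-in for the local echo provider's stable transform:
// identical prompts always produce identical outputs.
function localEcho(prompt: string): string {
  return `echo: ${prompt}`;
}

// A CI smoke test can snapshot this output with no API key:
const first = localEcho("hello");
const second = localEcho("hello");
```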

For real local inference, point at Ollama / LM-Studio:

```shell
export MANDU_LOCAL_BASE_URL=http://127.0.0.1:11434
mandu ai chat --provider=local --model=llama3
```

## Aborting a response

Hit `Ctrl+C` during an in-flight response. The stream disconnects cleanly, the failed turn is dropped from history, and the REPL returns to the prompt. Useful when a model is about to produce 4000 tokens you didn't want.
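The abort flow can be sketched with the standard `AbortController`: Ctrl+C maps to `abort()`, the aborted turn is discarded, and history keeps only completed turns. The `streamReply` / `runTurn` signatures are illustrative, not Mandu's internal API:

```typescript
interface Turn { role: "user" | "assistant"; content: string }

// Toy stream: yields pre-baked chunks, bailing out if the signal fires.
async function streamReply(
  prompt: string,
  signal: AbortSignal,
  chunks: string[],
): Promise<string> {
  let out = "";
  for (const chunk of chunks) {
    if (signal.aborted) throw new DOMException("aborted", "AbortError");
    out += chunk;
  }
  return out;
}

// Only fully streamed turns are committed to history; an aborted
// turn is dropped and the REPL would simply return to the prompt.
async function runTurn(
  history: Turn[],
  prompt: string,
  ctl: AbortController,
  chunks: string[],
): Promise<void> {
  try {
    const reply = await streamReply(prompt, ctl.signal, chunks);
    history.push({ role: "user", content: prompt }, { role: "assistant", content: reply });
  } catch (err) {
    if ((err as Error).name !== "AbortError") throw err; // aborted: drop the turn
  }
}
```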

## Security notes

1. **API keys are never printed.** Errors mask them as `sk-***`.

2. **Slash-command args are escaped.** `/preset ../etc` is rejected by a strict alphanumeric allow-list before the file open.

3. **Non-UTF-8 / NUL-containing input is rejected** with `CLI_E308`, so it never hits the adapter's HTTP body.

4. **`Bun.secrets` fallback.** If the env var is absent, `mandu ai` reads from the OS keychain. Store keys with:

   ```shell
   bun -e "await Bun.secrets.set('MANDU_CLAUDE_API_KEY', 'sk-ant-...')"
   ```
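The allow-list in point 2 can be sketched as follows — preset names must be plain alphanumeric tokens (dashes/underscores allowed), so traversal input like `../etc` never reaches the filesystem. The regex and function name are assumptions, not Mandu's actual implementation:

```typescript
// Hypothetical strict allow-list for /preset arguments.
const PRESET_NAME = /^[A-Za-z0-9][A-Za-z0-9_-]*$/;

function presetPath(name: string): string {
  if (!PRESET_NAME.test(name)) {
    // Rejected before any file open is attempted.
    throw new Error(`invalid preset name: ${JSON.stringify(name)}`);
  }
  return `docs/prompts/${name}.md`;
}
```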

## Tips

- **Tail a long-running session.** Open a second terminal and tail the save file:

  ```shell
  # In the REPL
  /save /tmp/history.json

  # In another terminal
  tail -f /tmp/history.json
  ```

- **Resume after disconnect.** `/load` is self-describing — it restores provider + model + system + messages in one gesture.

- **Repo-aware Q&A.** Combine `/preset mandu-conventions` with a strong model:

  ```text
  /preset mandu-conventions
  /model claude-sonnet-4-0
  > Where does the guard preset `cqrs` live?
  ```

## Troubleshooting

- **`CLI_E300: API key missing`** — export the right `MANDU_*_API_KEY` for your provider.

- **`CLI_E301: stream failed`** — check network / provider status. Most common cause: a proxy that's rewriting SSE headers.

- **`CLI_E303: preset 'my-preset' not found`** — the preset loader looks for `docs/prompts/<name>.md` in your project root. Create that file or use `--system <path>` for arbitrary locations.

- **`CLI_E307: timeout`** — raise `MANDU_AI_TIMEOUT_MS` or add `--timeout=180000`.

- **Garbled Unicode in the REPL** — Windows terminals need the UTF-8 code page. Run `chcp 65001` or use Windows Terminal.

## 🤖 Agent Prompt

```text
🤖 Agent Prompt — `mandu ai chat`
Apply the guidance from the Mandu docs page at https://mandujs.com/docs/ai/chat to my project.

Summary of the page:
`mandu ai chat` is a streaming REPL. Slash commands: /help /reset /save /load /preset /system /provider /model /quit. History scrollback bounded to 100 turns. Default provider `local` works without API keys.

Required invariants — must hold after your changes:
- Default provider is `local` — works offline with deterministic echo
- In-memory scrollback bounded to 100 turns — older entries drop when exceeded
- History JSON schema v1 — `/save` / `/load` are portable across sessions
- Ctrl+C during streaming aborts the turn cleanly — failed turn is dropped from history
- Slash-command args are escaped: `/preset ../etc` rejected by strict alphanumeric allow-list

Then:
1. Make the change in my codebase consistent with the page.
2. Run `bun run guard` and `bun run check` to verify nothing
   in src/ or app/ breaks Mandu's invariants.
3. Show me the diff and any guard violations.
```

## For Agents

```json
{
  "schema": "mandu.ai.chat/v0.25",
  "command": "mandu ai chat",
  "providers": ["claude", "openai", "gemini", "local"],
  "default_provider": "local",
  "history_schema": "v1",
  "scrollback_limit_turns": 100,
  "slash_commands": [
    "/help", "/reset", "/save <path>", "/load <path>",
    "/preset <name>", "/system <path>",
    "/provider <name>", "/model <id>",
    "/quit", "/exit", "/bye"
  ],
  "ctrl_c_behavior": "aborts the current turn cleanly, drops it from history",
  "rules": [
    "Default provider is `local` — works offline, deterministic, no API key",
    "Slash args are allow-list escaped — path traversal blocked",
    "Keys can be stored in Bun.secrets as an env var fallback"
  ]
}
```

Guard scope: `ai`