Prompt templates
Mandu's prompt template system — `docs/prompts/*.md` files loaded via `--preset` or `/preset`, plus the provider-specific adapter architecture behind `mandu ai`.
Mandu's prompt template system lets you version system prompts as
ordinary Markdown files under docs/prompts/, load them by name,
and ship them with your repo so every teammate and every CI job uses
the same instructions.
# Load a preset as the system prompt (chat)
mandu ai chat --preset=mandu-conventions
# Or via slash command mid-session
/preset mandu-conventions
# Eval across providers with the same preset
mandu ai eval --preset=mandu-conventions \
  --prompt="What is the guard preset for CQRS?" \
  --providers=claude,openai
File layout
docs/
└── prompts/
├── mandu-conventions.md # repo-specific rules
├── reviewer.md # PR reviewer persona
├── system.md # base system prompt
└── phase-testing.md # task-specific preamble
Preset names must match `[a-zA-Z0-9_-]+` — no slashes, no dots, no tilde.
`/preset ../etc` is rejected by the strict allow-list before any file is
opened. The same check blocks path traversal in all other slash commands.
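As a sketch, the documented check reduces to a single regex test before any filesystem access. The function name and error text below are illustrative; only the regex and the `CLI_E304` code come from this page:

```typescript
// Illustrative allow-list check; the function name is an assumption,
// the regex and error code come from the docs.
const PRESET_NAME = /^[a-zA-Z0-9_-]+$/;

function validatePresetName(name: string): string {
  if (!PRESET_NAME.test(name)) {
    // "../etc", "foo/bar", "foo.md", "~root" all fail before any file open.
    throw new Error(`CLI_E304: invalid preset name '${name}'`);
  }
  return name;
}
```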
Preset shape
Presets are plain Markdown with optional frontmatter:
---
name: mandu-conventions
version: 1.0.0
audience: AI Agents
last_verified: 2026-04-18
---
# Mandu Conventions Reference
Concrete, code-oriented reference for the three core building blocks.
## 1. Slots (Server-side data loaders)
...
Frontmatter is parsed but not currently surfaced — it's useful for
mandu ai eval batch runs where you want to pin a preset to a known
version in CI.
How the preset becomes the system prompt
When you pass --preset=X (or run /preset X):
1. Validate `X` against the allow-list `[a-zA-Z0-9_-]+`.
2. Resolve to `docs/prompts/<X>.md` relative to project root.
3. Read the file (Bun.file).
4. Strip the frontmatter, if present.
5. Replace the in-memory system prompt wholesale.
6. Subsequent user turns are sent with this as the system message.
The preset never "merges" with a previously-loaded system — each
--preset / /preset swap is a complete replacement.
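Taken together, steps 1 through 5 might look like the sketch below (it uses `node:fs` in place of `Bun.file` for portability; the helper names are assumptions):

```typescript
import { readFile } from "node:fs/promises";
import { join } from "node:path";

const PRESET_RE = /^[a-zA-Z0-9_-]+$/;

// Frontmatter is an optional leading "---" block; everything after the
// closing "---" is the prompt body.
function stripFrontmatter(md: string): string {
  const m = md.match(/^---\n[\s\S]*?\n---\n/);
  return m ? md.slice(m[0].length) : md;
}

// Steps 1-5 from the list above (step 6 is the chat loop itself).
async function loadPreset(name: string, projectRoot: string): Promise<string> {
  if (!PRESET_RE.test(name)) {
    throw new Error(`CLI_E304: invalid preset name '${name}'`);
  }
  const path = join(projectRoot, "docs", "prompts", `${name}.md`);
  let raw: string;
  try {
    raw = await readFile(path, "utf8");
  } catch {
    throw new Error(`CLI_E303: preset '${name}' not found`);
  }
  return stripFrontmatter(raw);
}
```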
The adapter architecture
Behind mandu ai chat and mandu ai eval is a small adapter layer
that normalizes Mandu's provider-agnostic shape into each provider's
wire protocol:
                  ┌───────────────────────────┐
                  │ Mandu-normalized messages │
                  │    { role, content }[]    │
                  └─────────────┬─────────────┘
                                │
                       ┌────────┴──────────┐
                       │ AiClient.stream() │
                       └────────┬──────────┘
                                │
       ┌────────────────┬───────┴────────┬────────────────┐
       │                │                │                │
       ▼                ▼                ▼                ▼
Claude adapter   OpenAI adapter   Gemini adapter    Local adapter
(SSE /messages)  (SSE /chat/      (SSE /generate    (deterministic
                 completions)     Content)          echo + OpenAI-
                                                    compatible)
Each adapter:
- Accepts `{ system, messages, model, timeoutMs }`.
- Translates to the provider's request body.
- Streams tokens back as `AsyncIterable<string>`.
- Masks API keys in every error path (the `sk-***` constant).
- Reports `usage` counts when the provider returns them.
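In TypeScript terms, that contract might be sketched as follows. The type and member names are illustrative; the real interface in Mandu's source may differ:

```typescript
// Hypothetical adapter contract matching the bullets above.
type ChatMessage = { role: "user" | "assistant"; content: string };

interface AiAdapterRequest {
  system: string;
  messages: ChatMessage[];
  model: string;
  timeoutMs: number;
}

interface AiAdapter {
  // Translate to the provider's wire protocol and stream tokens back;
  // implementations mask API keys (sk-***) in every error path.
  stream(req: AiAdapterRequest): AsyncIterable<string>;
}

// Minimal deterministic adapter in the spirit of the local echo fallback.
const echoAdapter: AiAdapter = {
  async *stream(req) {
    const last = req.messages[req.messages.length - 1];
    for (const word of (last?.content ?? "").split(" ")) {
      yield word + " ";
    }
  },
};
```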
The local adapter
`local` is special — it speaks the OpenAI-compatible protocol
(`/v1/chat/completions`), so it works with:
- A deterministic echo responder when `MANDU_LOCAL_BASE_URL` is unset.
- Ollama via `http://127.0.0.1:11434/v1` (set `MANDU_LOCAL_BASE_URL`).
- LM Studio via its local OpenAI-compatible endpoint.
- Any other self-hosted OpenAI-compatible server.
This means "run the eval on a local model" is a one-env-var change — no new adapter required.
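A sketch of that switch (the env var and paths come from this page; the function name and return shape are assumptions):

```typescript
// Picks the local provider's endpoint from MANDU_LOCAL_BASE_URL.
// Function name and return shape are illustrative.
function resolveLocalEndpoint(
  baseUrl: string | undefined,
): { url: string | null; echo: boolean } {
  if (!baseUrl) {
    // Unset: deterministic echo responder, no network call at all.
    return { url: null, echo: true };
  }
  // Ollama, LM Studio, or any self-hosted OpenAI-compatible server.
  return { url: `${baseUrl.replace(/\/+$/, "")}/chat/completions`, echo: false };
}
```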
Writing a good preset
Rules of thumb for presets that ship with a repo:
- State the role. "You are a senior engineer working inside a Mandu project."
- Enumerate non-negotiable rules. Use a numbered / tagged list so the model can reference specific rules.
- Ship concrete code examples. The adapter layer passes the preset verbatim — a code block in the preset becomes a code block in the prompt context.
- Add "don't" items. Negative examples constrain the model more reliably than positive ones alone.
- Version the preset. When a preset changes, bump `version` in the frontmatter so CI diff tests catch accidental drift.
See docs/prompts/mandu-conventions.md in the Mandu source tree as
a reference implementation.
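Put together, a small preset that follows these rules might look like this (the contents are illustrative, not a file that ships with Mandu):

```markdown
---
name: reviewer
version: 1.0.0
audience: AI Agents
last_verified: 2026-04-18
---

# Reviewer Persona

You are a senior engineer reviewing pull requests in a Mandu project.

## Rules
1. Check every changed route against its contract.
2. Keep each comment under two sentences.

## Don't
- Don't approve changes to `docs/prompts/` without a `version` bump.
```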
Ship-with-your-app presets
A growing project typically has multiple presets for different workflows:
| Preset | Used by |
|---|---|
| `system.md` | Base system prompt — inherited by most chats |
| `mandu-conventions.md` | Repo-specific rules (slots, islands, contracts) |
| `phase-testing.md` | Agents running testing tasks |
| `phase-auth.md` | Agents wiring auth flows |
| `reviewer.md` | PR review persona — rigid, concise |
| `loop-closure.md` | Post-stall guidance (used by `mandu.loop.close`) |
Teammates and CI just reference preset names; the file contents stay versioned in git.
Custom system prompts without a preset
For one-off prompts you don't want to commit:
# From a file outside docs/prompts/
mandu ai chat --system=./scratch/my-system.md
# Or inline (one-liner)
echo "You are a concise reviewer." \
| mandu ai eval --system=/dev/stdin --prompt="PR #123 diff…"
--system <path> accepts any readable file path — no allow-list
beyond the filesystem's own read permission check. Use --preset
for versioned, shared prompts; --system for ad-hoc work.
Common errors
`CLI_E303: preset 'foo' not found` — the preset loader looks for
`docs/prompts/foo.md` in your project root. Create the file or use
`--system <path>` for arbitrary locations.
`CLI_E304: invalid preset name` — names must match
`[a-zA-Z0-9_-]+`. Slashes, dots, and tildes are rejected to prevent path
traversal.
Model ignores the preset — some smaller local models weight the most-recent turn more heavily than the system message. Try moving the critical rules into the first user turn, or switch to a larger model.
🤖 Agent Prompt
Apply the guidance from the Mandu docs page at https://mandujs.com/docs/ai/prompts to my project.
Summary of the page:
Prompt presets live at `docs/prompts/<name>.md`. Loaded by `mandu ai chat --preset=<name>` / `mandu ai eval --preset=<name>` / `/preset <name>` slash command. Name is allow-listed to `[a-zA-Z0-9_-]+` — path traversal blocked. Adapter layer: each provider maps Mandu-normalized messages → provider-specific SSE format.
Required invariants — must hold after your changes:
- Preset names are allow-listed to `[a-zA-Z0-9_-]+` — slashes, dots, tilde rejected at parse
- Preset lookup path is exactly `docs/prompts/<name>.md` relative to the project root
- Frontmatter is optional but recommended — `name`, `version`, `audience`, `last_verified`
- Adapter layer normalizes messages into a provider-agnostic shape before streaming
- System prompt is separate from user messages — never concatenated into the first user turn
Then:
1. Make the change in my codebase consistent with the page.
2. Run `bun run guard` and `bun run check` to verify nothing
in src/ or app/ breaks Mandu's invariants.
3. Show me the diff and any guard violations.
Related
- AI — Chat — slash commands including `/preset`.
- AI — Eval — non-interactive preset usage.
- AI — Loop closure — how `mandu.loop.close` composes a `nextPrompt`.
For Agents
{
"schema": "mandu.ai.prompts/v0.25",
"preset_path": "docs/prompts/<name>.md",
"preset_name_allowlist": "[a-zA-Z0-9_-]+",
"frontmatter": {
"name": "string",
"version": "semver string",
"audience": "string",
"last_verified": "ISO date"
},
"adapter_architecture": {
"claude": "SSE /messages",
"openai": "SSE /chat/completions",
"gemini": "SSE /generateContent",
"local": "OpenAI-compatible — deterministic echo OR Ollama/LM-Studio via MANDU_LOCAL_BASE_URL"
},
"replacement_semantics": "each --preset swap is a complete replacement, not a merge",
"rules": [
"Preset names never contain slashes, dots, or tilde — allow-list blocks path traversal",
"Version presets in frontmatter — CI diff tests catch drift",
"`--system <path>` for ad-hoc; `--preset <name>` for shipped, versioned prompts"
]
}