Hanz74/brix
Generic process orchestrator for Claude Code — combine Python, HTTP, CLI and MCP building blocks into pipelines with unified JSON interface
Platform-specific configuration:

```json
{
  "mcpServers": {
    "brix": {
      "command": "npx",
      "args": [
        "-y",
        "brix"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
A skill runtime for Claude Code. Turn multi-step workflows into reusable slash commands backed by real pipelines — not prompt chains.
Brix combines modular building blocks (Python, HTTP, CLI, MCP) into pipelines with a unified JSON interface. Each pipeline can be exposed as a Claude Code custom slash command — giving Claude powerful, token-efficient skills that run as native processes.
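As a rough illustration of what a "unified JSON interface" means for a Python building block, the sketch below shows one block consuming and producing JSON strings. The function name and payload shape are hypothetical, not taken from brix's documentation:

```python
import json

def run_block(request_json: str) -> str:
    """One Python building block under a hypothetical unified JSON
    contract: a JSON string in, a JSON string out. Illustrative
    shape only; brix's actual interface may differ."""
    params = json.loads(request_json)
    result = {"count": len(params.get("items", []))}
    return json.dumps(result)

print(run_block('{"items": ["a.pdf", "b.pdf"]}'))  # {"count": 2}
```

Because every block speaks the same JSON contract, an orchestrator can chain Python, HTTP, CLI, and MCP steps without per-step glue code.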
AI coding assistants like Claude Code are powerful — but they're wasteful at multi-step workflows. Consider a real task: "Download the last 50 invoice PDFs from my Outlook."
Without Brix, Claude does this:

```
1.  MCP: list-mail-messages (filter=Rechnung)  → wait, context grows
2.  MCP: list-mail-attachments (mail #1)       → wait, context grows
3.  MCP: list-mail-attachments (mail #2)       → wait, context grows
    ... 48 more times ...
50. MCP: list-mail-attachments (mail #50)      → wait, context grows
51. MCP: get-mail-attachment (attachment #1)   → wait, context grows
    ... for every attachment ...
Python: save file, generate report
```

That's ~164 tool calls. Each one sends the entire conversation context back and forth. At ~4,000 tokens per round-trip, that's ~656,000 tokens consumed — for what is essentially a batch job.
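The token estimate above is simple multiplication, sketched here as a back-of-the-envelope check:

```python
# Back-of-the-envelope check of the cost estimate above.
tool_calls = 164                # MCP round-trips in the no-Brix flow
tokens_per_round_trip = 4_000   # approximate context resent per call
total_tokens = tool_calls * tokens_per_round_trip
print(total_tokens)  # 656000
```

The per-call figure is an estimate; the point is that cost scales linearly with the number of round-trips, while a single batched call pays the context cost once.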
With Brix, Claude does this:

```shell
brix run download-attachments-broad.yaml \
  -p keywords="Rechnung,Invoice" -p top=200 \
  -p output_dir=/host/root/dev/invoices
```

One call. 6.7 seconds. 79 PDFs. 17 MB on disk.
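The invocation above follows a regular shape: a pipeline file plus repeated `-p key=value` parameters. A small helper can assemble it programmatically; note this helper is illustrative, not part of brix itself:

```python
import shlex

def build_brix_command(pipeline: str, params: dict) -> list:
    """Assemble a `brix run` invocation as an argv list.
    Illustrative helper, not shipped with brix."""
    cmd = ["brix", "run", pipeline]
    for key, value in params.items():
        cmd += ["-p", f"{key}={value}"]
    return cmd

cmd = build_brix_command(
    "download-attachments-broad.yaml",
    {"keywords": "Rechnung,Invoice", "top": 200},
)
print(shlex.join(cmd))
# brix run download-attachments-broad.yaml -p keywords=Rechnung,Invoice -p top=200
```

Building an argv list (rather than a shell string) keeps parameter values safe to pass to `subprocess.run` without quoting bugs.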
We tested two strategies against a real Microsoft 365 mailbox:
| Strategy | What it does | API calls | Time | Result |
|----------|--------------|-----------|------|--------|
| Without Brix | Claude makes each call individually | ~164 | ~10 min+ | Fragile, context overflow risk |
| Targeted | ODa | | | |