# MauricioPerera/n8n-a2e
1B-parameter models compose production n8n workflows at 70B-level quality via Context-Time Training (CTT). Includes 4 LLM providers, a feedback loop, a plan normalizer, and a model-evaluation framework.
Powered by [Context-Time Training (CTT)](https://github.com/MauricioPerera/repomemory-v2) — the production validation of entity-based memory architecture applied to n8n automation.
> Key finding: A 1B-parameter model achieves 86% deploy-ready workflows with three lightweight guard rails (feedback loop + plan normalizer + inline retry). Small models fail on format, not logic — structured error feedback closes the gap without fine-tuning or larger models.
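The inline-retry guard rail can be sketched as a loop that feeds validation errors back into the prompt. This is a minimal illustration, not the project's actual API: `compose`, `validate`, and `retryWithFeedback` are hypothetical names standing in for the LLM call and the format checker.

```typescript
// Hedged sketch of the feedback loop + inline retry.
// compose() stands in for the LLM call; validate() returns format errors.
type Plan = { nodes: unknown[] };

function retryWithFeedback(
  compose: (prompt: string) => Plan,
  validate: (plan: Plan) => string[],
  prompt: string,
  maxRetries = 3,
): Plan | null {
  let current = prompt;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const plan = compose(current);
    const errors = validate(plan);
    if (errors.length === 0) return plan; // deploy-ready
    // Feed structured errors back so the model can fix its formatting.
    current = `${prompt}\nFix these errors:\n- ${errors.join("\n- ")}`;
  }
  return null; // give up after maxRetries attempts
}
```

Because small models fail on format rather than logic, even one structured-error retry recovers most rejected plans.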
n8n-a2e transforms natural language descriptions into fully valid n8n workflow JSON, deploys them to a running n8n instance via REST API, and learns from successful deployments to improve future compositions.
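Deployment against a running n8n instance might look like the sketch below, assuming the instance exposes the public REST API (`POST /api/v1/workflows` authenticated with an `X-N8N-API-KEY` header); the helper name and the local URL are illustrative, so adapt them to your setup.

```typescript
// Sketch: build the deploy request for a composed workflow JSON.
type N8nWorkflow = {
  name: string;
  nodes: unknown[];
  connections: Record<string, unknown>;
};

function buildDeployRequest(baseUrl: string, apiKey: string, wf: N8nWorkflow) {
  return {
    url: `${baseUrl}/api/v1/workflows`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "X-N8N-API-KEY": apiKey, // key from your n8n instance settings
      },
      body: JSON.stringify(wf),
    },
  };
}

// Usage (network call commented out so the sketch stays self-contained):
const req = buildDeployRequest("http://localhost:5678", "my-api-key", {
  name: "Slack to Sheets",
  nodes: [],
  connections: {},
});
// await fetch(req.url, req.options);
```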
"Create a workflow that watches Slack and logs messages to Google Sheets"
↓
┌─────────────────┐
│ 1. RECALL │ TF-IDF search → relevant nodes + patterns
├─────────────────┤
│ 2. COMPOSE │ LLM generates WorkflowPlan → valid JSON
├─────────────────┤
│ 3. VALIDATE │ Check params, credentials, connections
├─────────────────┤
│ 4. DEPLOY │ POST to n8n REST API
├─────────────────┤
│ 5. LEARN │ Save as reusable pattern
└─────────────────┘
↓
Active workflow in n8ncrypto, fs, path, fetch)Loading reviews...
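The RECALL step above can be sketched as a tiny TF-IDF ranking over node descriptions. This is a hypothetical, minimal version for illustration (the real index and its document set are not shown here); `tfidfRank` and the sample node catalog are invented names.

```typescript
// Minimal TF-IDF recall: rank node descriptions against the user request.
type Doc = { id: string; text: string };

function tokenize(s: string): string[] {
  return s.toLowerCase().split(/[^a-z0-9]+/).filter(Boolean);
}

function tfidfRank(query: string, docs: Doc[]): string[] {
  const N = docs.length;
  const tokensByDoc = docs.map((d) => tokenize(d.text));
  // Document frequency per term.
  const df = new Map<string, number>();
  for (const toks of tokensByDoc) {
    for (const t of new Set(toks)) df.set(t, (df.get(t) ?? 0) + 1);
  }
  const qTokens = tokenize(query);
  const scored = docs.map((d, i) => {
    const toks = tokensByDoc[i];
    let score = 0;
    for (const q of qTokens) {
      const tf = toks.filter((t) => t === q).length / toks.length;
      const idf = Math.log((N + 1) / ((df.get(q) ?? 0) + 1)) + 1; // smoothed
      score += tf * idf;
    }
    return { id: d.id, score };
  });
  return scored.sort((a, b) => b.score - a.score).map((s) => s.id);
}

// Toy node catalog standing in for the real index.
const nodes: Doc[] = [
  { id: "slackTrigger", text: "Slack trigger watches channel messages" },
  { id: "googleSheets", text: "Google Sheets append rows to spreadsheet" },
  { id: "httpRequest", text: "HTTP request call any REST API" },
];
const ranked = tfidfRank("watch Slack and log messages to Google Sheets", nodes);
```

The two relevant nodes surface ahead of the unrelated one, which is all the COMPOSE step needs as context.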