Pick the best model per subtask, automatically. Plan with Opus, code with GPT-5, critique with Gemini — or any rule you write.
Dagrun routes your prompt across the models and local tools that actually get it done — then merges the results on your machine. Agentic. Local-first. Yours.
No single model is best at everything. Dagrun treats them like instruments — planning, coding, critiquing, retrieval — and splits the work accordingly, on your machine, with your tools, under your rules.
Your files, your shell, your context. Dagrun reads and writes on your machine. Nothing is uploaded unless you send it.
Every tool call is logged with inputs, outputs, and the model that made it. Revoke any permission at any time.
Parallel fan-out across models and tools. Results merge locally. Finishes while others are still planning.
The prompt enters. The orchestra plays. The artifact lands in your filesystem. No black box, no unbounded agent loop.
Natural language. No slash commands, no templates to memorize. Dagrun reads it, then plans.
Plan in Opus. Research in Sonnet. Code-gen in GPT-5. Critique in Gemini. Tool calls to your shell, fs, git, browser — local.
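A per-step route can be as small as a lookup table. This is a hypothetical sketch only: Dagrun's actual rule format isn't shown here, and the step names and model identifiers below are illustrative, not real provider IDs.

```typescript
// Hypothetical routing rule — field names and model IDs are illustrative,
// not Dagrun's documented config format.
type Step = "plan" | "research" | "codegen" | "critique";

const routes: Record<Step, string> = {
  plan: "anthropic/claude-opus",
  research: "anthropic/claude-sonnet",
  codegen: "openai/gpt-5",
  critique: "google/gemini",
};

// Resolve which model handles a given step of the run.
function modelFor(step: Step): string {
  return routes[step];
}
```

The point is that routing is data, not prompt engineering: swap a string and the next run uses a different instrument for that step.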
Everything is synthesized locally. Files written where you said. Audit log stored on disk. You own every byte.
Dagrun runs as a native Mac app with a CLI twin. The plan, the filesystem access, the tool brokering, the audit trail — all on your machine. Model APIs are the only network calls, and only when you pick a cloud model.
A proper Mac window — unified toolbar, vibrancy sidebar, traffic lights where they belong. The agent graph, tool log, and command palette are first-class citizens, not a chat bubble in disguise.
Dagrun doesn’t resell inference. Paste your Anthropic key, your OpenAI key, your local model path — and route per step.
Tools are small TypeScript modules with a schema. Drop one in ~/dagrun/tools/ and it shows up in your next run.
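A minimal sketch of what such a module might look like. The exact shape Dagrun expects isn't documented here — the `schema` export, the `run` function, and the JSON-Schema-style parameter block are all assumptions for illustration.

```typescript
// Hypothetical tool module — the `schema`/`run` contract is assumed,
// not Dagrun's documented interface.
import { readFile } from "node:fs/promises";

export const schema = {
  name: "word_count",
  description: "Count the words in a local text file",
  parameters: {
    type: "object",
    properties: {
      path: { type: "string", description: "Path of the file to read" },
    },
    required: ["path"],
  },
};

// The broker would call this with arguments validated against the schema.
export async function run({ path }: { path: string }) {
  const text = await readFile(path, "utf8");
  return { words: text.trim().split(/\s+/).length };
}
```

The schema is what the models see; the function body is what runs on your machine.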
Permissions are per-tool, per-flow. The agent never invokes anything you haven’t explicitly allowed.
You pay model providers directly. Dagrun doesn’t mark up inference — ever.
For individuals. Bring your own keys.
Small teams sharing routes and audit trails.
SOC 2, SSO, on-prem routing. Ship an intern.