Run Core
Open-source AI agent runtime.
Memory, context engineering, multi-agent orchestration — all on your machine.
One command. Any model. You decide what stays local.
npx @runcore-sh/runcore
curl -fsSL https://runcore.sh/install.sh | bash
Memory that persists
Episodic, semantic, and procedural memory in append-only JSONL. Your agent remembers decisions, learns from failures, and builds context across sessions. File-backed, searchable, yours.
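The append-only JSONL idea can be sketched in a few lines. This is an illustration of the storage pattern, not Run Core's actual schema or API — the record shape and function names here are hypothetical.

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Hypothetical record shape; Run Core's real schema is its own.
type MemoryRecord = {
  ts: string;
  kind: "episodic" | "semantic" | "procedural";
  text: string;
};

// Append-only: records are only ever added, never rewritten in place.
function remember(file: string, rec: MemoryRecord): void {
  fs.appendFileSync(file, JSON.stringify(rec) + "\n");
}

// Recall is a linear scan over lines — file-backed and greppable.
function recall(file: string, query: string): MemoryRecord[] {
  if (!fs.existsSync(file)) return [];
  return fs
    .readFileSync(file, "utf8")
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as MemoryRecord)
    .filter((rec) => rec.text.includes(query));
}
```

Because each record is one self-describing line, the store stays readable with `grep`, diffable in git, and trivially safe to append to from multiple sessions.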
Context engineering
Progressive disclosure loads only what's needed per turn. Skills route to modules. The assembler builds LLM-ready context from working memory, long-term memory, and brain files.
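Progressive disclosure can be pictured as a budgeted assembly pass: sources are ranked, and anything that does not fit the budget for this turn is simply not loaded. A minimal sketch, with invented names and a rough character budget standing in for real token counting:

```typescript
// Hypothetical source type; the real assembler's inputs are Run Core's own.
type ContextSource = { name: string; priority: number; load: () => string };

// Take sources in priority order and stop loading once the
// (rough, character-based) budget is spent — only what's needed per turn.
function assembleContext(sources: ContextSource[], budgetChars: number): string {
  const parts: string[] = [];
  let used = 0;
  for (const src of [...sources].sort((a, b) => a.priority - b.priority)) {
    const chunk = src.load();
    if (used + chunk.length > budgetChars) continue; // skip what doesn't fit
    parts.push(`## ${src.name}\n${chunk}`);
    used += chunk.length;
  }
  return parts.join("\n\n");
}
```

The key property: low-priority or oversized material never enters the prompt at all, rather than being loaded and then truncated.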
Multi-agent orchestration
Spawn agents with goals, constraints, and tool access. Action-based heartbeat — every agent action doubles as the liveness ping. Batch operations, continuation rounds, and a shared event bus.
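The action-as-heartbeat idea can be shown with a toy event bus. This is a conceptual sketch, not Run Core's orchestrator API — the types and class here are made up for illustration:

```typescript
type AgentEvent = { agent: string; action: string; at: number };

// Shared event bus: every agent action is published, and the publish
// itself updates liveness — no separate keepalive loop to run.
class EventBus {
  private handlers: Array<(e: AgentEvent) => void> = [];
  lastSeen = new Map<string, number>();

  subscribe(fn: (e: AgentEvent) => void): void {
    this.handlers.push(fn);
  }

  publish(e: AgentEvent): void {
    this.lastSeen.set(e.agent, e.at); // the action is the heartbeat
    for (const fn of this.handlers) fn(e);
  }
}
```

An idle agent therefore looks exactly like a dead one, which is the point: liveness is defined by doing work, not by answering pings.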
Any model, local-first
Ollama, Claude, OpenAI, OpenRouter — switch freely. Data stays on your machine in plain files. PII redaction before anything leaves. Go fully local for zero exposure.
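Redaction-before-egress can be as simple as a scrub pass over outbound text. The patterns below are deliberately crude illustrations — a real redactor needs a much fuller ruleset than two regexes:

```typescript
// Illustrative patterns only; production PII detection is far broader.
const EMAIL = /[\w.+-]+@[\w-]+\.[\w.]+/g;
const PHONE = /\+?\d[\d\s().-]{7,}\d/g;

// Scrub text before it leaves the machine for a remote model.
function redact(text: string): string {
  return text.replace(EMAIL, "[EMAIL]").replace(PHONE, "[PHONE]");
}
```

Running redaction locally, before the API call, means the remote provider never sees the original strings — and with a local model the question doesn't arise at all.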
MCP tools built in
18 tools out of the box: memory, brain search, whiteboard, vouchers, alerts, open loops, web fetch. Expose via MCP server or HTTP — same tool, two surfaces.
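"Same tool, two surfaces" means the tool logic lives once and each transport is a thin adapter over it. A minimal sketch of that shape, with invented names and no real MCP or HTTP wiring:

```typescript
type ToolHandler = (args: Record<string, string>) => string;

// One registry of tool implementations.
const tools = new Map<string, ToolHandler>();
tools.set("echo", (args) => args.text ?? "");

// Surface 1: direct dispatch (stand-in for the MCP server path).
function callTool(name: string, args: Record<string, string>): string {
  const handler = tools.get(name);
  if (!handler) throw new Error(`unknown tool: ${name}`);
  return handler(args);
}

// Surface 2: HTTP-style dispatch from a parsed request body —
// it reuses the same registry rather than duplicating tool logic.
function handleHttp(body: string): string {
  const { tool, args } = JSON.parse(body);
  return callTool(tool, args);
}
```

Adding a tool to the registry makes it available on both surfaces at once, which is what keeps the two from drifting apart.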
Brain is just files
Markdown, YAML, JSONL — no database, no vendor format. Bring your own structure or start from the template. Back up to your cloud, version in git, read with any editor.
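A files-only brain might look like the layout below. The names here are illustrative, not the template's actual structure — the point is that every piece is a plain file you can open, grep, and commit:

```
brain/
  profile.md         # who the agent is, in markdown
  config.yaml        # settings, in YAML
  memory/
    episodic.jsonl   # append-only event log
    semantic.jsonl   # facts and learned knowledge
```

Because nothing here needs a database driver to read, backup is `cp`, versioning is `git`, and migration away is trivial.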