Run Core

Open source AI agent runtime.
Memory, context engineering, multi-agent orchestration — all on your machine.
One command. Any model. You decide what stays local.

Install & run
npx @runcore-sh/runcore
Requires Node.js 18+
Or install everything automatically:
curl -fsSL https://runcore.sh/install.sh | bash
Installs Node.js and Ollama, pulls a model, and starts Core, all in one line.

Memory that persists

Episodic, semantic, and procedural memory in append-only JSONL. Your agent remembers decisions, learns from failures, and builds context across sessions. File-backed, searchable, yours.
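A minimal sketch of what append-only JSONL memory can look like. The file path, entry fields, and function names here are illustrative assumptions, not Run Core's actual schema; the point is that one JSON object per line, never rewritten, is both durable and trivially searchable.

```typescript
// Hypothetical append-only JSONL memory store (not Run Core's real schema).
import { appendFileSync, readFileSync, existsSync } from "node:fs";

type MemoryKind = "episodic" | "semantic" | "procedural";

interface MemoryEntry {
  kind: MemoryKind;
  ts: string;     // ISO timestamp
  text: string;   // what the agent remembers
  tags: string[]; // labels for retrieval
}

// Append-only write: one JSON object per line, existing lines never change.
export function remember(entry: MemoryEntry, file = "memory.jsonl"): void {
  appendFileSync(file, JSON.stringify(entry) + "\n");
}

// Search by scanning lines — the file is plain JSONL, so any tool can read it.
export function recall(query: string, file = "memory.jsonl"): MemoryEntry[] {
  if (!existsSync(file)) return [];
  return readFileSync(file, "utf8")
    .split("\n")
    .filter(Boolean)
    .map((line) => JSON.parse(line) as MemoryEntry)
    .filter((e) => e.text.includes(query) || e.tags.includes(query));
}
```

Because entries are only appended, a crashed session never corrupts prior memory, and `grep` or any editor works as a fallback search tool.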

Context engineering

Progressive disclosure loads only what's needed per turn. Skills route to modules. The assembler builds LLM-ready context from working memory, long-term memory, and brain files.
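The progressive-disclosure idea can be sketched as an assembler that walks context sources in priority order and stops at a budget. The interface and budget-in-characters are assumptions for illustration; the real assembler's API will differ.

```typescript
// Illustrative progressive-disclosure assembler (names are assumptions).
interface ContextSource {
  name: string;
  load: () => string; // loaded lazily, only if the loop reaches it
}

export function assembleContext(
  sources: ContextSource[], // ordered by priority: working memory first,
  budgetChars: number       // then long-term memory, then brain files
): string {
  const parts: string[] = [];
  let used = 0;
  for (const src of sources) {
    const text = src.load();                      // disclosed only when reached
    if (used + text.length > budgetChars) break;  // budget exhausted: stop
    parts.push(`## ${src.name}\n${text}`);
    used += text.length;
  }
  return parts.join("\n\n");
}
```

High-priority sources always make it into the prompt; lower-priority material is only pulled in when the budget allows, which keeps each turn's context small.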

Multi-agent orchestration

Spawn agents with goals, constraints, and tool access. Action-based heartbeat — every agent action is the ping. Batch operations, continuation rounds, and a shared event bus.
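One way to picture the action-based heartbeat: every action an agent takes doubles as its liveness ping, so no separate keepalive is needed. The spec fields, event names, and timeout below are hypothetical, chosen only to make the idea concrete.

```typescript
// Hypothetical agent spec + shared event bus (field names are assumptions).
import { EventEmitter } from "node:events";

interface AgentSpec {
  name: string;
  goal: string;
  constraints: string[];
  tools: string[];
}

const bus = new EventEmitter();                 // shared event bus
const lastAction = new Map<string, number>();   // agent name -> last action time

// Recording an action IS the heartbeat: acting proves the agent is alive.
export function act(agent: AgentSpec, action: string): void {
  lastAction.set(agent.name, Date.now());
  bus.emit("action", { agent: agent.name, action });
}

// An agent with no recent action is presumed stalled.
export function isAlive(name: string, timeoutMs = 30_000): boolean {
  const t = lastAction.get(name);
  return t !== undefined && Date.now() - t < timeoutMs;
}
```

Other agents can subscribe to the same bus, which is what makes batch operations and continuation rounds observable without polling.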

Any model, local-first

Ollama, Claude, OpenAI, OpenRouter — switch freely. Data stays on your machine in plain files. PII redaction before anything leaves. Go fully local for zero exposure.
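A minimal sketch of redaction before text leaves the machine. These two patterns are illustrative only; real PII coverage needs many more categories (names, addresses, IDs) than a pair of regexes.

```typescript
// Illustrative PII scrubbing pass, applied before any remote model call.
const EMAIL = /[\w.+-]+@[\w-]+\.[\w.]+/g;
const PHONE = /\b\+?\d[\d\s().-]{7,}\d\b/g;

// Replace matches with placeholder tokens so the model still sees
// that *something* was there, without seeing the value itself.
export function redact(text: string): string {
  return text.replace(EMAIL, "[EMAIL]").replace(PHONE, "[PHONE]");
}
```

With a local Ollama model the pass can be skipped entirely, since the prompt never leaves the machine in the first place.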

MCP tools built in

18 tools out of the box: memory, brain search, whiteboard, vouchers, alerts, open loops, web fetch. Expose via MCP server or HTTP — same tool, two surfaces.
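The "same tool, two surfaces" idea can be sketched like this: the tool is a plain function, and each surface is just a thin wrapper around it. The route, payload shape, and tool name below are assumptions, not Run Core's actual API.

```typescript
// One surface-agnostic tool function; HTTP shown, MCP would wrap the same fn.
import { createServer } from "node:http";

// The tool itself knows nothing about transport: plain input in, plain out.
export function whiteboardWrite(note: string): { ok: boolean; note: string } {
  return { ok: true, note };
}

// HTTP surface: a hypothetical POST route that delegates to the tool.
export const server = createServer((req, res) => {
  if (req.method === "POST" && req.url === "/tools/whiteboard") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const { note } = JSON.parse(body);
      res.setHeader("content-type", "application/json");
      res.end(JSON.stringify(whiteboardWrite(note)));
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
});
// server.listen(8787) to serve HTTP; an MCP server would instead register
// whiteboardWrite as a tool — the function itself never changes.
```

Keeping transport out of the tool means adding a third surface later is another thin wrapper, not a rewrite.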

Brain is just files

Markdown, YAML, JSONL — no database, no vendor format. Bring your own structure or start from the template. Back up to your cloud, version in git, read with any editor.

What happens

npx @runcore-sh/runcore
Browser opens
Name your agent
It remembers, learns, acts