CTXONE // memory for agents

Stop re-explaining your project every morning.

You know the feeling. Third day in a row, you paste the same paragraph about your stack. You budget tokens for memory the model should already have. That low-grade dread has a name: context anxiety.

CTXone is the memory layer that cures it — stateful across sessions, sharable across tools, inspectable with git-style blame.

$ curl -sSL https://raw.githubusercontent.com/ctxone/ctxone/main/install.sh | sh

BSL-1.1 → Apache 2.0 · Self-hosted · MCP native · Zero telemetry

CONTEXT_ANXIETY.md

Three flavors of context anxiety.

Everyone using AI coding tools feels all three, even if they've never named them. CTXone's three pillars map one-to-one to the three flavors of the pain.

01 // STATEFUL

Yesterday's context is gone.

You told Claude on Tuesday that you're using SQLite, not Postgres, and why. Wednesday morning you open a new chat and it's asking about your "Postgres schema" again. You paste the same paragraph. You budget tokens for memory the model should already have.

CTXone remembers. Every fact survives sessions, branches, and tool switches because it's stored in a local graph, not in the model's context window.
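
The principle is simple enough to sketch. This is a toy illustration of the idea, not CTXone's actual storage format: facts live in a file on disk, so a "new session" is just a fresh read of the same file.

```python
import json
import tempfile
from pathlib import Path

# Stand-in path for the demo; the real graph lives under ~/.ctxone.
GRAPH = Path(tempfile.gettempdir()) / "toy_graph.json"
GRAPH.unlink(missing_ok=True)  # fresh start for the demo

def remember(fact: str, context: str) -> None:
    """Append a fact to the on-disk graph file."""
    graph = json.loads(GRAPH.read_text()) if GRAPH.exists() else []
    graph.append({"fact": fact, "context": context})
    GRAPH.write_text(json.dumps(graph))

def recall(context: str) -> list[str]:
    """Re-read the graph from disk: a 'new session' sees the same facts."""
    if not GRAPH.exists():
        return []
    return [f["fact"] for f in json.loads(GRAPH.read_text())
            if f["context"] == context]

# Tuesday's session:
remember("using SQLite, not Postgres", context="db")

# Wednesday's session reads the file, not a context window:
print(recall("db"))  # -> ['using SQLite, not Postgres']
```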

02 // SHARABLE

Your teammate is flying blind.

You primed your Cursor install with the team's architectural decisions. Priya didn't. Now she's arguing with the model about whether to use Redis for the job queue — a question you settled three months ago in Slack and she never saw.

CTXone is shared. The graph is a file. Commit it, sync it, mount it over the team's lab network. Whatever you primed is what everyone else sees.
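
Because the graph is a plain file, "sharing" is whatever you already do with files. A minimal sketch of the idea (paths are illustrative; the real graph lives under ~/.ctxone, and the shared copy could be a repo checkout or a network mount):

```python
import json
import shutil
import tempfile
from pathlib import Path

# Illustrative paths only, standing in for two machines.
tmp = Path(tempfile.mkdtemp())
yours = tmp / "you" / "graph.json"
shared = tmp / "team" / "graph.json"
for p in (yours, shared):
    p.parent.mkdir(parents=True, exist_ok=True)

# You prime your graph with a settled decision...
yours.write_text(json.dumps(
    [{"fact": "Redis for the job queue", "context": "infra"}]))

# ...then publish it like any other file: commit it, rsync it, copy it.
shutil.copy(yours, shared)

# Priya's tool opens the shared file and sees exactly what you primed.
priya_sees = json.loads(shared.read_text())
print(priya_sees[0]["fact"])  # -> Redis for the job queue
```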

03 // INSPECTABLE

Who told the model that?

You open a file Claude edited last week and there's a comment about "the new API versioning policy." You don't remember a policy. Nobody on the team remembers a policy. Did the model hallucinate it? Did you mention it once in a side chat? Nobody can check. The decision is orphaned.

CTXone is accountable. Every commit carries an agent ID, a timestamp, an intent, and reasoning. ctx blame shows you exactly who wrote what, when, and why — git-style.

Read the full context anxiety argument →

THREE_VERBS //

Three verbs. That's the whole product.

CTXone boils down to three things an AI tool does to memory. Everything else — branches, priming, session stats, provenance — falls out of these.

01 // REMEMBER

Write a fact once. It survives sessions, branches, and tool switches. Importance maps to a confidence score so high-value facts aren't drowned out by chatter.

$ ctx remember "BSL-1.1 for all new repos" \
    --importance high --context licensing

# from an LLM:
remember(fact="BSL-1.1 for all new repos",
         importance="high", context="licensing")

02 // RECALL

Ask a topic. Get pinned context plus relevant facts, capped at a token budget. Every response includes the live savings ratio so the claim is provable in real time.

$ ctx recall "licensing" --budget 1500

# from an LLM:
recall(topic="licensing", budget=1500)

03 // BLAME

Every commit carries an agent ID, a timestamp, an intent, and optional reasoning. ctx blame traces a fact back to the tool, user, and session that wrote it.

$ ctx blame /memory/legal/bsl

# who wrote it, when, and why:
agent:  claude-code
when:   2026-03-14T09:17:23Z
intent: Observe
reason: user preference

LIVE // ctx_savings_ratio

5× on day one. More as your graph grows.

Every recall response includes a _ctxone_stats field showing exactly how many tokens you sent vs how many you would have sent if the model had seen the full memory graph. This isn't marketing — it's the live delta the Hub computes on every request.
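
The arithmetic behind the ratio is simple (a sketch of the idea; the real Hub counts tokens per request):

```python
def savings_ratio(graph_tokens: int, sent_tokens: int) -> float:
    """Tokens the model would have seen (full graph) over tokens actually sent."""
    return graph_tokens / sent_tokens

# Day one: a few dozen facts (~1,500 tokens) versus a 300-token recall.
print(savings_ratio(1500, 300))  # -> 5.0

# Recall stays capped at the token budget while the graph keeps growing,
# so the ratio climbs with use:
for graph in (1500, 3000, 6000):
    print(savings_ratio(graph, 300))  # -> 5.0, then 10.0, then 20.0
```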

Flat memory (what the model sees without CTXone):
  1,500 tokens, every session, every time

CTXone recall (what the model sees with CTXone):
  300 tokens, topic-matched + pinned

Savings ratio on day one: 5×

Shown: a fresh graph with a few dozen facts. As you write more, recall stays the same size but the baseline grows — the ratio climbs to 10×, 20×, and higher on graphs that have been lived in for a week or more. Start at 5×, discover the rest. See the math →

Works with the tools you already use

CTXone exposes MCP for AI coding tools, native plugins for chat UIs, and direct client libraries for everything else. ctx init auto-detects your tools and wires them up in one command.
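
For MCP-capable clients, the wiring would take the shape of a standard MCP server entry along these lines (the `serve` subcommand and `--mcp` flag here are illustrative assumptions, not documented CLI output):

```json
{
  "mcpServers": {
    "ctxone": {
      "command": "ctx",
      "args": ["serve", "--mcp"]
    }
  }
}
```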

Install it. Use it. See the ratio climb.

There is no signup. No server to rent. No SaaS bill. Just a binary that runs on your laptop and a graph file under ~/.ctxone.

$ curl -sSL https://raw.githubusercontent.com/ctxone/ctxone/main/install.sh | sh

Source-available under BSL-1.1. Every release converts to Apache 2.0 four years after it ships — full story →