Self-hosted memory for your AI assistants — 86% top-1 recall, zero external calls.
The problem
Every "memory" startup wants you to upload your most personal context to their servers. The selling point is convenience; the price is your data. No clean self-hosted primitive fills that gap: you either assemble a Postgres-and-pgvector stack yourself, or you give up and trust someone else's infra.
The proposal
memorystore is the missing primitive: one Docker Compose file, one SQLite file, one MCP server. Plug it into Claude Desktop, Cursor, or your own tooling. The data stays on your machine. The benchmark in this repo backs that up: 86% top-1 and 96% top-5 recall on a 50-question evaluation set, 58 ms p95 query latency, and zero external network calls, verified by an audit log committed alongside the calibration runs.
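Wiring an MCP server into Claude Desktop is done through its `claude_desktop_config.json`. A sketch of what that entry might look like for memorystore follows; the server name, command, and path here are hypothetical, since the project's actual entrypoint isn't specified above:

```json
{
  "mcpServers": {
    "memorystore": {
      "command": "docker",
      "args": ["compose", "-f", "/path/to/memorystore/docker-compose.yml", "run", "--rm", "mcp"]
    }
  }
}
```

Any stdio-launched command works in that slot; the Docker Compose invocation is just one way to keep the server and its SQLite file in one place.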
Why now
MCP shipped. sqlite-vec shipped. Ollama is on every developer's laptop. Local embedding quality is good enough for personal-scale recall. The pieces all exist; this assembles them into a single primitive that survives r/selfhosted scrutiny.
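To make the recall claim concrete, here is a brute-force sketch of the core loop such a primitive performs: memories live in a single SQLite file as (text, embedding) rows, and a query embedding is ranked against them by cosine similarity. sqlite-vec does this ranking inside SQLite itself; the standard-library version below shows the same idea at toy scale. The table name, column names, and toy vectors are illustrative, not memorystore's actual schema.

```python
import math
import sqlite3
import struct

def encode(vec):
    # Pack a list of floats into a blob, as a vector column would store it.
    return struct.pack(f"{len(vec)}f", *vec)

def decode(blob):
    # Inverse of encode: blob back to a list of float32 values.
    return list(struct.unpack(f"{len(blob) // 4}f", blob))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(db, query_vec, k=5):
    # Brute-force scan: score every stored memory, return the k best.
    rows = db.execute("SELECT text, vec FROM memories").fetchall()
    scored = [(cosine(query_vec, decode(blob)), text) for text, blob in rows]
    return sorted(scored, reverse=True)[:k]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE memories (text TEXT, vec BLOB)")
for text, vec in [
    ("prefers dark roast coffee", [1.0, 0.0, 0.0]),   # toy 3-dim embeddings;
    ("drinks espresso daily",     [0.9, 0.1, 0.0]),   # real ones come from a
    ("files taxes in April",      [0.0, 0.0, 1.0]),   # local model via Ollama
]:
    db.execute("INSERT INTO memories VALUES (?, ?)", (text, encode(vec)))

# Query with an embedding near the coffee cluster.
for score, text in top_k(db, [1.0, 0.0, 0.0], k=2):
    print(f"{score:.3f}  {text}")
```

The whole pipeline (embed with Ollama, store in SQLite, rank with sqlite-vec) never leaves the machine, which is what the zero-external-calls audit is checking.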
Get notified when memorystore hits 1.0.
Who's behind this
Built by Aiden, a solo developer working through a portfolio of AI primitives in 2026. memorystore is one of several exploratory bets; see the hub for the others. This is alpha software: expect rough edges, and please file issues.