# ai-memory

universal AI memory -- persistent memory for any AI, zero token cost until recall
ai-memory is a persistent memory system for AI assistants. It works with any AI that supports MCP -- Claude, ChatGPT, Grok, Llama, and more. It stores what your AI learns in a local SQLite database, ranks memories by relevance when recalling, and auto-promotes important knowledge to permanent storage. Install it once, and every AI assistant you use remembers your architecture, your preferences, your corrections -- forever.
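Relevance-ranked recall can be pictured as a weighted scoring pass over stored memories, keeping only the top results. The sketch below is illustrative only: the README refers to a 6-factor scoring algorithm but does not enumerate the factors, so keyword overlap, recency, and usage count here are stand-ins, and the record shape is hypothetical (the real data lives in SQLite).

```python
import time

# Hypothetical memory records; ai-memory's real schema lives in SQLite.
memories = [
    {"text": "the API gateway uses JWT auth", "created": time.time() - 86400, "uses": 5},
    {"text": "user prefers 4-space indents", "created": time.time() - 3600, "uses": 2},
    {"text": "CI runs on push to main", "created": time.time() - 604800, "uses": 9},
]

def score(memory, query):
    """Toy relevance score: keyword overlap + recency + usage count."""
    overlap = len(set(query.lower().split()) & set(memory["text"].lower().split()))
    recency = 1.0 / (1.0 + (time.time() - memory["created"]) / 86400)  # decays per day
    return 2.0 * overlap + 1.0 * recency + 0.1 * memory["uses"]

def recall(query, k=2):
    """Return the top-k memories for a query, best match first."""
    return sorted(memories, key=lambda m: score(m, query), reverse=True)[:k]

top = recall("auth for the API")
print(top[0]["text"])  # the JWT-auth memory outranks the unrelated ones
```

Because nothing is loaded until `recall` runs, the context cost of storing memories is zero; only the small top-k slice ever reaches the model.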
Zero token cost until recall. Unlike built-in memory systems (Claude Code auto-memory, ChatGPT memory) that load your entire memory into every conversation -- burning tokens and money on every message -- ai-memory uses zero context tokens until the AI explicitly calls memory_recall. Only relevant memories come back, ranked by a 6-factor scoring algorithm.

TOON format (Token-Oriented Object Notation) cuts response tokens by another 40-60% by eliminating repeated field names: 3 memories in JSON = 1,600 bytes; in TOON = 626 bytes (61% smaller); in TOON compact = 336 bytes (79% smaller).

For Claude Code users: disable auto-memory ("autoMemoryEnabled": false in settings.json) and replace it with ai-memory to stop paying for 200+ lines of memory context on every single message.
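The savings from eliminating repeated field names are easy to demonstrate. The sketch below compares plain JSON against a simplified tabular encoding in the spirit of TOON (field names stated once, then one row of values per record); the memory schema and delimiter are assumptions for illustration, not ai-memory's actual wire format.

```python
import json

# Three hypothetical memories (illustrative only; not ai-memory's real schema).
memories = [
    {"id": 1, "text": "project uses SQLite", "score": 0.91, "tags": "db"},
    {"id": 2, "text": "user prefers tabs", "score": 0.84, "tags": "style"},
    {"id": 3, "text": "API lives on localhost", "score": 0.77, "tags": "net"},
]

# Plain JSON repeats every field name in every record.
as_json = json.dumps(memories)

# A TOON-like tabular encoding states the field names once,
# then emits one pipe-delimited row of values per record.
header = "id|text|score|tags"
rows = ["|".join(str(m[k]) for k in ("id", "text", "score", "tags")) for m in memories]
as_toon = "\n".join([header] + rows)

print(len(as_json), len(as_toon))  # the tabular form is markedly smaller
```

The gap widens with record count: field names are a fixed per-record overhead in JSON but a one-time header cost in the tabular form.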
ai-memory integrates with any AI platform that supports the Model Context Protocol (MCP). MCP is the universal standard for connecting AI assistants to external tools and data sources.
| Platform | Integration Method | Config Format | Status |
|----------|-------------------|---------------|--------|
| Claude Code (Anthropic) | MCP stdio | JSON (~/.claude.json or .mcp.json) | Fully supported |
| Codex CLI (OpenAI) | MCP stdio | TOML (~/.codex/config.toml) | Fully supported |
| Gemini CLI (Google) | MCP stdio | JSON (~/.gemini/settings.json) | Fully supported |
| Grok (xAI) | MCP remote HTTPS | API-level | Fully supported |
| Cursor IDE | MCP stdio | JSON (~/.cursor/mcp.json) | Fully supported |
| Windsurf (Codeium) | MCP stdio | JSON (~/.codeium/windsurf/mcp_config.json) | Fully supported |
| Continue.dev | MCP stdio | YAML (~/.continue/config.yaml) | Fully supported |
| Llama Stack (Meta) | MCP remote HTTP | YAML / Python SDK | Fully supported |
| OpenClaw | MCP stdio | JSON (mcp.servers in config) | Fully supported |
| Any MCP client | MCP stdio or HTTP | Varies | Universal |
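As a concrete example of the stdio shape shared by most rows above, a Claude Code entry in `~/.claude.json` or `.mcp.json` might look like the following. The command path is a placeholder assumption based on the generic MCP `mcpServers` convention; check the install output or the project README for the actual binary location and any required arguments.

```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "/usr/local/bin/ai-memory",
      "args": []
    }
  }
}
```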
MCP is the primary integration layer. For AI platforms that do not yet support MCP natively, the HTTP API (20 endpoints on localhost) and the CLI (25 commands) provide universal access -- any AI, script, or automation that can make HTTP calls or run shell commands can use ai-memory.
Pre-built binaries require no dependencies. Building from source needs Rust and a C compiler.
Fastest: Pre-built binary (no Rust required)
```sh
# macOS / Linux
curl -fsSL https://raw.githubusercontent.com/alphaonedev/ai-memory-mcp/main/install.sh | sh

# Ubuntu (PPA)
sudo add-apt-repository ppa:jbridger2021/ai-memory && sudo apt i
```
... [View full README on GitHub](https://github.com/alphaonedev/ai-memory-mcp#readme)