An MCP server that gives LLMs persistent, searchable semantic memory
```json
{
  "mcpServers": {
    "io-github-daedalus-mcp-external-memory": {
      "args": [
        "mcp-external-memory"
      ],
      "command": "uvx"
    }
  }
}
```
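Most MCP clients also accept an `env` block in each server entry, which is one way to pin the embedding backend discussed below. A sketch of the same config with that block added; the `openai` value is an assumption, check the server's documentation for accepted `MEMORY_EMBED_BACKEND` values:

```json
{
  "mcpServers": {
    "io-github-daedalus-mcp-external-memory": {
      "args": [
        "mcp-external-memory"
      ],
      "command": "uvx",
      "env": {
        "MEMORY_EMBED_BACKEND": "openai"
      }
    }
  }
}
```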
Is it safe?
- No known CVEs for mcp-external-memory.
- No authentication: any process on your machine can connect.
- License not specified.

Is it maintained?
- Last commit 12 days ago.

Will it work with my client?
- Transport: SSE, HTTP. Compatibility not confirmed.

Context cost
- 7 tools, ~300 tokens (~0.15% of a 200K context window).
Run this in your terminal to verify the server starts:

```shell
uvx 'mcp-external-memory' 2>&1 | head -1 && echo "✓ Server started successfully"
```
- `memory_store`: persist text with optional namespace/tags/metadata
- `memory_search`: semantic search (cosine similarity) over all memories
- `memory_get`: retrieve a single memory by ID
- `memory_delete`: delete a memory by ID
- `memory_list`: list memories with optional namespace/tag filter and pagination
- `memory_stats`: count of memories, namespaces, DB path
- `memory_update`: update an existing memory
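MCP clients invoke these tools with a JSON-RPC `tools/call` request. A minimal sketch of the request a client would send for `memory_store`; the argument names follow the tool descriptions above but the exact schema is an assumption:

```python
import json

# Hypothetical tools/call request for the memory_store tool.
# Argument names (content, namespace, tags) mirror the tool list above.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "memory_store",
        "arguments": {
            "content": "Alice prefers dark mode",
            "namespace": "users",
            "tags": ["alice", "ui"],
        },
    },
}

# Serialize for the wire (stdio or HTTP transport).
print(json.dumps(request, indent=2))
```

The client sends this over whichever transport the server exposes; the server replies with a result object keyed to the same `id`.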
An MCP server that gives LLMs persistent, searchable semantic memory.
```shell
pip install mcp-external-memory
```

```python
from mcp_external_memory import memory_store, memory_search

# Store a memory
result = memory_store(content="Alice prefers dark mode", namespace="users", tags=["alice", "ui"])

# Search memories
results = memory_search(query="what does Alice prefer?", namespace="users")
```

```shell
mcp-external-memory --help
```
| Tool | Description |
|------|-------------|
| memory_store | Persist text + optional namespace/tags/metadata |
| memory_search | Semantic search (cosine similarity) over all memories |
| memory_get | Retrieve a single memory by ID |
| memory_delete | Delete a memory by ID |
| memory_list | List memories with optional namespace/tag filter + pagination |
| memory_stats | Count of memories, namespaces, DB path |
| memory_update | Update an existing memory |
The server supports multiple embedding backends, e.g. OpenAI's `text-embedding-3-small` model. Select a backend via the `MEMORY_EMBED_BACKEND` environment variable.
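Whichever backend produces the embeddings, `memory_search` ranks stored memories by cosine similarity between the query vector and each memory vector. A self-contained sketch of that metric, with toy 3-dimensional vectors standing in for real embeddings:

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (||a|| * ||b||): 1.0 for identical directions, 0.0 for orthogonal.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

query = [0.1, 0.9, 0.2]
memories = {
    "mem-1": [0.1, 0.8, 0.3],  # points the same way as the query
    "mem-2": [0.9, 0.1, 0.0],  # nearly orthogonal to the query
}

# Rank memory IDs by similarity to the query, best match first.
ranked = sorted(memories, key=lambda m: cosine_similarity(query, memories[m]), reverse=True)
print(ranked)  # mem-1 ranks first
```

Real embeddings have hundreds or thousands of dimensions, but the ranking logic is the same.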
```shell
git clone https://github.com/daedalus/mcp-external-memory.git
cd mcp-external-memory
pip install -e ".[test]"

# run tests
pytest

# format
ruff format src/ tests/

# lint
ruff check src/ tests/

# type check
mypy src/
```
mcp-name: io.github.daedalus/mcp-external-memory