Persistent memory & knowledge graph for your AI assistant.
Keyword + vector + graph search in a single lookup.
Contexta is a memory MCP server that remembers like a human does — by meaning, by words, and by relationships.
Most memory tools give you one flavor of recall: embedding search, a raw log, or a graph you pay extra for. Contexta fuses all three into a single query. Your agent can ask "what did we decide about pricing with Acme last quarter?" and get back the right meeting, the right decision, and the people connected to it — in one round-trip.
Under the hood, Contexta indexes your notes, messages, and documents as typed entities (people, projects, decisions, meetings, tasks) and links them into a knowledge graph automatically. Every retrieval combines:
- keyword search over the raw text (by words),
- vector search over embeddings (by meaning), and
- graph traversal across linked entities (by relationships).
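The README doesn't say how the three result lists are merged into one ranking. A common technique for this kind of hybrid retrieval is reciprocal rank fusion (RRF); the sketch below is purely illustrative (function name, document IDs, and the k=60 constant are assumptions, not Contexta internals):

```python
def rrf_fuse(rankings, k=60):
    """Reciprocal rank fusion: merge several ranked result lists.

    Each document scores 1 / (k + rank) per list it appears in,
    so items ranked highly by multiple retrievers rise to the top.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical per-retriever rankings for one query:
keyword = ["meeting-42", "note-7", "task-3"]
vector  = ["note-7", "meeting-42", "doc-9"]
graph   = ["meeting-42", "person-acme", "note-7"]

fused = rrf_fuse([keyword, vector, graph])
# "meeting-42" ranks first: it appears near the top of all three lists.
```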
One-click install from the Smithery listing — Smithery handles the OAuth flow and proxies connections through its gateway.
Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):
{
  "mcpServers": {
    "contexta": {
      "url": "https://mcp.contexta.cc/mcp"
    }
  }
}
Claude will walk you through the OAuth sign-in on first use. See examples/claude-desktop.json.
Add a new MCP server in Cursor → Settings → MCP:
https://mcp.contexta.cc/mcp. See examples/cursor.json.
In ChatGPT → Settings → Connectors → Add custom connector → paste https://mcp.contexta.cc/mcp. See examples/chatgpt.md for full instructions.
Contexta uses OAuth 2.0 with dynamic client registration (RFC 7591) and PKCE. On first connection, your MCP client opens a browser window, you sign in with your Contexta account, and the client receives a per-user access token. Tokens are refreshed automatically.
No API keys, no shared credentials — every request runs as a specific user with that user's private memory.
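The PKCE step mentioned above (RFC 7636) is what lets the client prove, without a shared secret, that the process redeeming the authorization code is the one that started the flow. For intuition, here is a minimal sketch of how any OAuth client derives the code_verifier/code_challenge pair with the S256 method; this is illustrative, not Contexta's client code:

```python
import base64
import hashlib
import secrets

def pkce_pair():
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636).

    The verifier is a high-entropy random string kept by the client;
    the challenge (its base64url-encoded SHA-256 hash) is sent in the
    authorization request. The server later checks that SHA-256 of the
    presented verifier matches the stored challenge.
    """
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = pkce_pair()
```

MCP clients that implement the spec's authorization flow do this for you; the point is only that no long-lived secret ever leaves your machine.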
Contexta MCP exposes the following tools to your AI client (non-exhaustive):
search — keyword + vector + graph search