AI Agent with Architectural Memory. Impact analysis (free), tests and code from the graph (pro).
AI Agent with Architectural Memory — MCP Server
Gives any AI coding agent persistent understanding of codebases via TransitionGraph, IdeaGraph, and WorldModel. Analyze impact, generate tests, write code — all from the graph.
Add to your editor's MCP config (Cursor, Claude Code, VS Code, Windsurf, Cline, JetBrains):
{
  "mcpServers": {
    "hokmah": {
      "type": "streamable-http",
      "url": "https://hokmah.dev/mcp"
    }
  }
}
Then ask your agent: "analyze the impact of refactoring the auth module in github.com/owner/repo"
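As a quick sanity check before restarting your editor, the config above can be parsed and validated with a few lines of Python (the field names are exactly those in the snippet; nothing beyond standard-library JSON parsing is assumed):

```python
import json

# The MCP config shown above, as written for any of the listed clients.
config_text = """
{
  "mcpServers": {
    "hokmah": {
      "type": "streamable-http",
      "url": "https://hokmah.dev/mcp"
    }
  }
}
"""

config = json.loads(config_text)
server = config["mcpServers"]["hokmah"]

# A malformed entry (wrong transport type, missing url) fails here
# rather than silently inside the editor.
assert server["type"] == "streamable-http"
assert server["url"] == "https://hokmah.dev/mcp"
print("config OK")
```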
| Tool | Tier | Description |
|---|---|---|
| hokmah_analyze | FREE | Impact analysis, risk score, affected files, architectural invariants |
| hokmah_connect_project | FREE | Connect a GitHub repo, build the architectural graph |
| hokmah_connect_mcp | FREE | Connect an external MCP server for orchestration |
| hokmah_generate_tests | PRO | Test generation from the graph (40x fewer tokens) |
| hokmah_generate_code | PRO | Code generation with architectural memory |
Hokmah builds a persistent architectural graph (TransitionGraph, IdeaGraph, WorldModel) from your codebase.
When you ask "what's the impact of changing X?", Hokmah traverses the graph instead of sending your entire codebase to an LLM. That's why analyze is free (zero LLM tokens) and generate uses 40x fewer tokens.
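The traversal idea can be sketched on a toy reverse-dependency graph. The file names and graph shape below are made up for illustration; Hokmah's TransitionGraph is richer than a plain adjacency map:

```python
from collections import deque

# Toy architectural graph: file -> files that depend on it.
# (Illustrative only; not Hokmah's internal representation.)
reverse_deps = {
    "auth/module.py": ["api/login.py", "api/session.py"],
    "api/login.py": ["web/routes.py"],
    "api/session.py": ["web/routes.py"],
    "web/routes.py": [],
}

def impact(changed: str) -> set[str]:
    """BFS over reverse dependencies: every file reachable from the change."""
    seen, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for dep in reverse_deps.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

print(sorted(impact("auth/module.py")))
# ['api/login.py', 'api/session.py', 'web/routes.py']
```

The point of the sketch: the answer is computed purely from graph edges, which is why this kind of query can be served without spending any LLM tokens.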
- Free: hokmah_analyze + hokmah_connect_project + hokmah_connect_mcp (unlimited)
- Pro: hokmah_generate_tests + hokmah_generate_code (BYOK — bring your own LLM key)

Get a Pro key at hokmah.dev.
- Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json
- Claude Code: claude mcp add hokmah --transport streamable-http --url https://hokmah.dev/mcp
- VS Code: .vscode/mcp.json in project root
- Windsurf: ~/.windsurf/mcp.json

The hosted server at https://hokmah.dev/mcp is the recommended way to use Hokmah. To run the server yourself against your own Hokmah backend:
pip install -r requirements.txt
cp pro_keys.example.json pro_keys.json # edit with your real PRO keys
HOKMAH_API_BASE=http://localhost:8000 python mcp_server.py
Environment variables:
- HOKMAH_API_BASE — upstream Hokmah API (default http://localhost:8000)
- HOKMAH_MCP_PORT — port to listen on (default 8001)
- HOKMAH_PRO_KEYS — path to the PRO keys JSON file (default /home/vpm/mcp-server/pro_keys.json)

A reference systemd unit is provided in hokmah-mcp.service.
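For reference, reading those variables with their documented defaults looks like this in Python. The variable names and defaults are from this README; the parsing logic is a sketch, not necessarily what mcp_server.py does:

```python
import os

# Defaults mirror those documented above; override via the environment.
API_BASE = os.environ.get("HOKMAH_API_BASE", "http://localhost:8000")
PORT = int(os.environ.get("HOKMAH_MCP_PORT", "8001"))
PRO_KEYS_PATH = os.environ.get("HOKMAH_PRO_KEYS", "/home/vpm/mcp-server/pro_keys.json")

print(f"listening on port {PORT}, forwarding to {API_BASE}")
```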
Catalyst AI Research · Haifa, Israel
MIT