Causal graph memory engine for AI agents with scoring, activation, and forgetting
The intelligence layer for AI memory.
Scoring engine + causal graph + lifecycle manager for AI agent memory. Speaks MCP natively.
Genesys is a scoring engine, causal graph, and lifecycle manager for AI memory. Memories are scored by a multiplicative formula (relevance × connectivity × reactivation), connected in a causal graph, and actively forgotten when they become irrelevant. It plugs into any storage backend and speaks MCP natively.
Your AI remembers everything but understands nothing. Genesys fixes that.
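To make the scoring idea concrete, here is a minimal sketch of a multiplicative score with decay-driven forgetting. The field names, example values, and threshold are illustrative assumptions, not Genesys's actual API; the point is that a memory must stay relevant, connected, and reactivated, since any factor near zero pulls the whole score toward zero.

```python
from dataclasses import dataclass

@dataclass
class Memory:
    # Illustrative fields only; the real Genesys schema may differ.
    relevance: float     # match against the current context
    connectivity: float  # strength of causal-graph connections
    reactivation: float  # how recently / how often it has been re-activated

def score(m: Memory) -> float:
    # Multiplicative formula: relevance x connectivity x reactivation.
    return m.relevance * m.connectivity * m.reactivation

def should_forget(m: Memory, threshold: float = 0.05) -> bool:
    # Hypothetical lifecycle rule: forget memories whose score falls below a floor.
    return score(m) < threshold

stale = Memory(relevance=0.9, connectivity=0.4, reactivation=0.1)
print(round(score(stale), 3), should_forget(stale))  # 0.036 True
```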
Most people should start with Option 1 (in-memory). If you want fully local with no API keys, jump to Option 3: Obsidian + local.
Option 1: In-memory (quickstart). The fastest way to try Genesys. No database required — state is kept in memory and optionally persisted to a JSON file.
pip install genesys-memory
cp .env.example .env
# Set OPENAI_API_KEY in .env
uvicorn genesys.api:app --port 8000
To persist across restarts, set GENESYS_PERSIST_PATH in .env:
GENESYS_PERSIST_PATH=.genesys_state.json
Give this to Claude to set it up for you: "Install genesys-memory, create a .env with my OpenAI key, start the server on port 8000 with the in-memory backend, and connect it as an MCP server."
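Once the server is running, you can do a quick reachability check from Python. This only verifies that something is listening on port 8000; the actual routes are defined by genesys.api, so the root path here is not assumed to be meaningful.

```python
# Smoke test: confirm something is listening on port 8000.
# Any HTTP response (even a 404) means the Genesys process is up.
import urllib.error
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:8000/", timeout=5) as resp:
        print("Server is up, HTTP", resp.status)
except urllib.error.HTTPError as err:
    print("Server is up, HTTP", err.code)
except (urllib.error.URLError, OSError) as err:
    print("Server not reachable:", err)
```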
Option 2: Postgres. Persistent, scalable storage with vector search via pgvector.
pip install 'genesys-memory[postgres]'
cp .env.example .env
Edit .env:
OPENAI_API_KEY=sk-...
GENESYS_BACKEND=postgres
DATABASE_URL=postgresql://genesys:genesys@localhost:5432/genesys
Start Postgres and run migrations:
docker compose up -d postgres
alembic upgrade head
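Optionally, you can confirm that the pgvector extension is actually installed in the target database before starting the server. This sketch assumes the psycopg 3 driver is available (`pip install 'psycopg[binary]'`); it is not part of Genesys, so swap in whichever Postgres driver you prefer.

```python
# Optional check that the pgvector extension exists in the target database.
# Assumes the psycopg 3 driver is installed; DATABASE_URL matches the .env value.
import os
import psycopg

url = os.environ.get(
    "DATABASE_URL", "postgresql://genesys:genesys@localhost:5432/genesys"
)
with psycopg.connect(url) as conn:
    row = conn.execute(
        "SELECT extversion FROM pg_extension WHERE extname = 'vector'"
    ).fetchone()
    print("pgvector version:", row[0] if row else "NOT INSTALLED")
```

Then start the server with the Postgres backend: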
GENESYS_BACKEND=postgres uvicorn genesys.api:app --port 8000
Give this to Claude to set it up for you: "Install genesys-memory[postgres], start a Postgres container with pgvector using docker compose, run alembic migrations, create a .env with my OpenAI key and DATABASE_URL, start the server with GENESYS_BACKEND=postgres, and connect it as an MCP server."
Option 3: Obsidian. Turns your Obsidian vault into a Genesys memory store. Markdown files become memory nodes, [[wikilinks]] become causal edges. A SQLite sidecar (.genesys/index.db) handles indexing.
pip install 'genesys-memory[obsidian]'
cp .env.example .env
Edit .env:
OPENAI_API_KEY=sk-...
GENESYS_BACKEND=obsidian
OBSIDIAN_VAULT_PATH=/path/to/your/vault
Start the server:
uvicorn genesys.api:app --port 8000
On first start, Genesys indexes all .md files in the vault and generates embeddings. A file watcher re-indexes incrementally when you edit notes.
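As a rough illustration of how a note maps onto the graph, here is a minimal sketch of pulling `[[wikilinks]]` out of a Markdown file. Genesys's real indexer (behind the `.genesys/index.db` sidecar) handles this internally, so the regex and function below are illustrative assumptions, not the library's code.

```python
# Illustrative only: extract [[wikilinks]] from a note, the links that
# Genesys turns into causal edges between memory nodes.
import re
from pathlib import Path

WIKILINK = re.compile(r"\[\[([^\]|#]+)")  # link target, before any | alias or # heading

def outgoing_links(note_path: str) -> list[str]:
    text = Path(note_path).read_text(encoding="utf-8")
    return [target.strip() for target in WIKILINK.findall(text)]

print(outgoing_links("/path/to/your/vault/Some Note.md"))
```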
If `OBSIDIAN_VAULT_PATH` is not set, Genesys auto-detects a vault by looking for `.obsidian/` in `~/Documents/personal`, `~/Documents/Obsidian`, and `~/obsidian`.