MCP server for soul.md — your portable AI identity layer across every LLM.
Your identity, not theirs. One file, every AI.
soul.md is an open format for your AI identity — values, voice, skills, taste. A plaintext file you own, version in git, and carry between tools. Manoma is the reference MCP server that makes any LLM read your soul.md.
See the soul.md format specification.
One entry in your Claude Desktop, Cursor, or Zenflow config:
```json
{
  "mcpServers": {
    "manoma": {
      "command": "npx",
      "args": ["-y", "manoma-mcp"]
    }
  }
}
```
Then drop your soul.md at ~/soul.md.
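Or start from scratch. As a rough illustration (the section names below are hypothetical; SPEC.md defines the actual format), a minimal soul.md might look like:

```markdown
# Values
Ship small, ship often. Plain text over lock-in.

# Voice
Direct, concrete, no filler.

# Skills
- TypeScript (deep)
- Product strategy (working)

# Now
Building in public; shipping weekly.
```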
cp templates/founder.md ~/soul.md (also: engineer.md, designer.md, pm.md).

Read — get_context(mode?), get_injection(mode?, message?), get_skill_depth(skill), list_sections().
Write-back — add_decision(...), add_lesson(...), update_now(...).
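Like any MCP server, these tools are invoked over JSON-RPC. A hypothetical tools/call request for get_context (the "concise" mode value here is illustrative, not taken from the server's docs) would be shaped like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_context",
    "arguments": { "mode": "concise" }
  }
}
```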
Resources expose every section as soul://section/<path>, plus soul://full for the whole file.
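To make the resource naming concrete, here is a sketch (not the server's actual code) of how markdown headings could map to soul://section/<path> URIs, assuming path segments are slugified heading titles nested by depth:

```typescript
// Illustrative only: derive soul:// resource URIs from a soul.md file's
// headings. The real mapping is defined by manoma-mcp, not by this sketch.
function sectionUris(markdown: string): string[] {
  const uris = ["soul://full"]; // the whole file is always exposed
  const stack: string[] = [];   // heading slugs, indexed by depth
  for (const line of markdown.split("\n")) {
    const m = line.match(/^(#{1,6})\s+(.*)$/);
    if (!m) continue;
    const depth = m[1].length;
    const slug = m[2]
      .trim()
      .toLowerCase()
      .replace(/[^a-z0-9]+/g, "-")
      .replace(/^-|-$/g, "");
    stack.length = depth - 1; // drop deeper siblings from the previous branch
    stack[depth - 1] = slug;
    uris.push("soul://section/" + stack.slice(0, depth).join("/"));
  }
  return uris;
}

console.log(sectionUris("# Values\n## Craft\n# Voice\n"));
```

A file with a Values section containing a Craft subsection would then expose soul://section/values and soul://section/values/craft alongside soul://full.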
Full details in mcp/README.md and SPEC.md.
```
manoma/
├── mcp/        — MCP server (npx manoma-mcp)
├── templates/  — starter soul.md files
├── SPEC.md     — format specification
└── README.md
```
Every AI vendor wants to own your memory. soul.md is the opposite bet: your identity as a plaintext file you control, interchangeable between tools, diff-able in git, outliving any individual model.
The schema is the thing. The runtime is just how it's read.
Contributions, forks, and parallel implementations welcome.
MIT.