An MCP server for searching and aggregating Disney Lorcana cards.
On startup, the server fetches all cards from lorcanajson.org via a GET request to https://lorcanajson.org/files/current/en/allCards.json.
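That startup fetch can be sketched as follows. This is an illustrative sketch, not the server's actual code: the top-level "cards" key is an assumption about the allCards.json payload shape, and the helper names are invented here.

```python
import json
import os
import urllib.request

# Default matches the documented LORCANA_API endpoint.
API_URL = os.environ.get(
    "LORCANA_API", "https://lorcanajson.org/files/current/en/allCards.json"
)

def extract_cards(payload):
    # allCards.json is assumed to wrap the list under a top-level "cards"
    # key; fall back to the payload itself if it is already a list.
    return payload["cards"] if isinstance(payload, dict) else payload

def fetch_all_cards(url=API_URL, timeout=60):
    # One GET returns the full card set as JSON (timeout mirrors the
    # LORCANA_HTTP_TIMEOUT_SECONDS default of 60).
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return extract_cards(json.load(resp))
```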
Cards are kept in-memory as a Python list for fast filtering. With ~2,700 cards this is lightweight and requires no external database. A local JSON file cache (LORCANA_CACHE_PATH, default cards.json) lets the server skip the API fetch on subsequent startups.
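A minimal sketch of that cache-or-fetch behavior, assuming a helper like the one below; the function name and signature are illustrative only (`refresh` maps to LORCANA_REFRESH_ON_STARTUP, `skip_if_cache` to LORCANA_SKIP_IF_DB_EXISTS):

```python
import json
from pathlib import Path

def load_cards(fetch, cache_path="cards.json", refresh=False, skip_if_cache=True):
    """Return the card list, preferring the local JSON cache when allowed.

    fetch is a zero-argument callable that retrieves the card list from
    the API.
    """
    cache = Path(cache_path)
    if not refresh and skip_if_cache and cache.exists():
        cached = json.loads(cache.read_text(encoding="utf-8"))
        if cached:  # only skip the fetch if the cache actually holds cards
            return cached
    cards = fetch()
    cache.write_text(json.dumps(cards), encoding="utf-8")
    return cards
```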
Startup data loading is controlled by:
LORCANA_REFRESH_ON_STARTUP:
- true: always fetch from the API and repopulate storage
- false: use the existing cache if available

LORCANA_SKIP_IF_DB_EXISTS:
- true (default): skip the API fetch if the cache file already contains cards
- false: fetch and repopulate

The server is published to GHCR and the MCP Registry. Pull and run it directly:
docker pull ghcr.io/danielenricocahall/lorcana-mcp:latest
docker run --rm -i ghcr.io/danielenricocahall/lorcana-mcp:latest
To persist the card cache across container restarts, mount a volume:
docker run --rm -i \
-e LORCANA_CACHE_PATH=/data/cards.json \
-e LORCANA_SKIP_IF_DB_EXISTS=true \
-v lorcana_mcp_data:/data \
ghcr.io/danielenricocahall/lorcana-mcp:latest
To run locally from source:
uv run python main.py
Or build and run the Docker image yourself:
docker build -t lorcana-mcp:latest .
docker run --rm -i lorcana-mcp:latest
Or with Docker Compose:
docker compose build
docker compose run --rm -T lorcana-mcp
Notes:
Notes:
- LORCANA_API (default: https://lorcanajson.org/files/current/en/allCards.json)
- LORCANA_CACHE_PATH (default: cards.json): local file for caching fetched cards
- LORCANA_HTTP_TIMEOUT_SECONDS (default: 60)
- LORCANA_REFRESH_ON_STARTUP (default: false)
- LORCANA_SKIP_IF_DB_EXISTS (default: true)

To run from source with an MCP client, add this to your client configuration:
{
"mcpServers": {
"lorcana": {
"command": "uv",
"args": ["run", "python", "/absolute/path/to/lorcana-mcp/main.py"]
}
}
}
Or, using the published GHCR image:
{
"mcpServers": {
"lorcana": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"ghcr.io/danielenricocahall/lorcana-mcp:latest"
]
}
}
}
Or, using a locally built image:
{
"mcpServers": {
"lorcana": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"lorcana-mcp:latest"
]
}
}
}
Or, via Docker Compose:
{
"mcpServers": {
"lorcana": {
"command": "docker",
"args": ["compose", "run", "--rm", "-T", "lorcana-mcp"]
}
}
}
For Claude Code, register the server with the published image:
claude mcp add --scope user \
-- lorcana docker run --rm -i \
ghcr.io/danielenricocahall/lorcana-mcp:latest
Or with a locally built image:
claude mcp add --scope user \
-- lorcana docker run --rm -i lorcana-mcp:latest
Once connected to an MCP client, you can ask natural-language questions for tasks such as:
- Card lookup
- Deck building
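Under the hood, lookups like these reduce to filtering the in-memory card list. A hypothetical illustration, where the field names ("color", "cost", "name") are assumptions about the card schema rather than the server's actual fields:

```python
def search_cards(cards, color=None, max_cost=None, name_contains=None):
    """Filter the in-memory card list by any combination of criteria."""
    results = cards
    if color is not None:
        results = [c for c in results if c.get("color") == color]
    if max_cost is not None:
        results = [c for c in results if c.get("cost", 0) <= max_cost]
    if name_contains is not None:
        needle = name_contains.lower()
        results = [c for c in results if needle in c.get("name", "").lower()]
    return results
```

With roughly 2,700 cards, plain list comprehensions like these are fast enough that no index or database is needed.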