An MCP server for discovering ColabFit datasets and training MACE interatomic potentials using KLIFF and KLAY.
This is a Model Context Protocol (MCP) server that bridges conversational AI and local compute: the AI agent searches for ColabFit datasets, trains models, and runs simulations on your machine through this server.
For local (non-Docker) installation, only Python 3.10+ is required. See Local Installation.
git clone https://github.com/colabfit/colabfit-mcp.git
cd colabfit-mcp
# One-time setup: creates data directories and .env file
make setup
# Build Docker images with your user ID for proper permissions
make build
# Start with GPU detection (CUDA → CPU fallback)
make start
Run make help to see all available commands.
If you prefer not to use the Makefile:
cp example.env .env
# Edit .env to customize data directory location if desired
# Default location
mkdir -p ./colabfit_data/models ./colabfit_data/datasets ./colabfit_data/inference_output ./colabfit_data/test_driver_output
# Or custom location (must match COLABFIT_DATA_ROOT in .env)
# mkdir -p /your/custom/path/{models,datasets,inference_output,test_driver_output}
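For reference, a minimal .env might contain only the data-root variable. This is a sketch — example.env in the repository is authoritative, and the path shown is the placeholder from the comment above:

```shell
# Hypothetical minimal .env -- copy example.env instead where possible.
# COLABFIT_DATA_ROOT must match the directory tree created above.
COLABFIT_DATA_ROOT=/your/custom/path
```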
# This ensures the container user matches your host user and selects the right
# Dockerfile for your platform (CPU-only on macOS, GPU on Linux with NVIDIA)
USER_ID=$(id -u) GROUP_ID=$(id -g) ./start.sh build
start.sh automatically detects NVIDIA GPU availability and enables GPU passthrough when present, falling back to CPU otherwise.
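The detection described above amounts to checking for a working nvidia-smi. A rough sketch of that logic follows — the actual start.sh may differ, and the `server` image name and flags are taken from the config examples in this README:

```shell
# Sketch of start.sh's GPU autodetection (the actual script may differ).
if command -v nvidia-smi >/dev/null 2>&1 && nvidia-smi >/dev/null 2>&1; then
  GPU_FLAGS="--gpus all"   # NVIDIA driver present: enable GPU passthrough
else
  GPU_FLAGS=""             # no usable GPU: fall back to CPU-only
fi
echo "docker run --rm -i ${GPU_FLAGS:+$GPU_FLAGS }server"
```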
Claude Code:
claude mcp add colabfit-mcp -- /path/to/colabfit-mcp/start.sh
Replace /path/to/colabfit-mcp with the absolute path to this repository.
Then restart Claude Code for the new server to take effect.
Claude Desktop:
Add to your Claude Desktop config (Settings > Developer > Edit Config):
{
"mcpServers": {
"colabfit-mcp": {
"command": "/path/to/colabfit-mcp/start.sh",
"args": ["run", "--rm", "-i", "server"]
}
}
}
OpenAI Agent (API-based, not ChatGPT app):
OpenAI agents that support MCP can connect to this server over stdio by launching the same command used above.
Use this command as the MCP server entrypoint:
/path/to/colabfit-mcp/start.sh
If your agent framework requires explicit command/args fields, use:
{
"command": "/path/to/colabfit-mcp/start.sh",
"args": ["run", "--rm", "-i", "server"]
}
Notes:
- The ChatGPT consumer app does not currently support stdio MCP server registration in the same way as developer agent runtimes.
- Replace /path/to/colabfit-mcp with the absolute path to this repository.
- The server uses standard MCP stdio transport and works with any MCP-compatible client.
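As a quick sanity check that any stdio client could perform, you can hand-roll an MCP initialize request and pipe it to the server. This is a sketch: the message fields follow the MCP JSON-RPC handshake, the protocolVersion string is an assumption, and the pipe line is commented out because it requires a built Docker image:

```shell
# Build an MCP initialize request by hand (fields per the MCP JSON-RPC
# handshake; the protocolVersion value is an assumption -- use the one
# your client pins).
INIT='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke","version":"0.0.1"}}}'
echo "$INIT"
# To exercise a built server, pipe the request in and read the first reply:
# echo "$INIT" | /path/to/colabfit-mcp/start.sh run --rm -i server | head -1
```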