⚠️ Disclaimer: this project is FOR EDUCATION, NOT PRODUCTION ⚠️
This project is an example of a Model Context Protocol (MCP) server that allows a Large Language Model to interact with the Artifex Engine.
To build the server, execute:
cargo build
The binary is produced in target/debug/, which the commands below add to PATH.
This MCP server can be used with Ollama, either through ollama-mcp-bridge or through the ollmcp client, as described below.
First, follow the official instructions to install Ollama on a GNU/Linux system.
Then, start Ollama and pull the model used in the examples below:
ollama serve
ollama pull qwen3:0.6b
Install ollama-mcp-bridge using uv Python package manager:
uv tool install ollama-mcp-bridge
Then, run the bridge:
PATH=$PATH:${PWD}/target/debug ollama-mcp-bridge \
--config ${PWD}/data/ollama-mcp-bridge/mcp-config.json \
--host 0.0.0.0 --port 8000
Use curl to interact with the chat API. For example:
curl -N -X POST http://localhost:8000/api/chat \
-H "accept: application/json" \
-H "Content-Type: application/json" \
-d '{
"model": "qwen3:0.6b",
"messages": [
{
"role": "system",
"content": "You are an Artifex Engine assistant."
},
{
"role": "user",
"content": "Give me the result of the inspection."
}
],
"think": false,
"stream": false
}'
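With "stream": false, the request returns a single JSON object. Assuming the bridge preserves Ollama's /api/chat response shape, the assistant's reply can be extracted with jq; the canned response below is illustrative, not actual server output:

```shell
# Illustrative non-streaming response in Ollama's /api/chat shape.
response='{"model":"qwen3:0.6b","message":{"role":"assistant","content":"Inspection passed."},"done":true}'
# The assistant's text lives under .message.content.
echo "$response" | jq -r '.message.content'
```

In a real session, pipe the output of the curl command above through the same jq filter.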
🔧 The configuration file data/ollama-mcp-bridge/mcp-config.json must be
modified.
Change:
"../../data/samples/config.toml"
To:
"./data/samples/config.toml"
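For reference, a minimal mcp-config.json might look like the sketch below. The server name matches the "artifex-engine" entry mentioned later; the command name and argument layout are hypothetical and depend on how the server binary is invoked — only the config.toml path comes from the note above:

```json
{
  "mcpServers": {
    "artifex-engine": {
      "command": "artifex-mcp-server",
      "args": ["./data/samples/config.toml"]
    }
  }
}
```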
Alternatively, install the ollmcp MCP client for Ollama using uv:
uv tool install ollmcp
Then, start the client:
PATH=$PATH:${PWD}/target/debug ollmcp --model qwen3:0.6b \
--servers-json ${PWD}/data/ollama-mcp-bridge/mcp-config.json
Use the tools command to list the available tools: "artifex-engine" should be
listed. Enable it and start chatting.
Copyright © 2025 Eric Le Bihan
This program is distributed under the terms of the MIT License.
See the LICENSE-MIT file for license details.