An MCP server written in Golang to interact with the Ollama API
License: Apache-2.0.
Transport: stdio (HTTP is also supported). Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
An MCP (Model Context Protocol) server written in Golang to interact with the Ollama API. This server provides tools for AI assistants to interact with local Ollama models through the MCP protocol.
This MCP server provides the following tools for interacting with Ollama: listModels, generate, chat, pullModel, and embeddings.
The server reads the Ollama host from the OLLAMA_HOST environment variable. If not set, it defaults to http://localhost:11434.
Example values:

```shell
export OLLAMA_HOST=http://localhost:11434
export OLLAMA_HOST=http://192.168.0.185:11434
export OLLAMA_HOST=http://spark.lan:11434
```
```shell
# Clone the repository
git clone <repository-url>
cd gollama-mcp-server

# Build the binary
go build -o bin/gollama-mcp-server ./cmd/main.go
```
Run the server in STDIO mode for use with MCP-compatible clients:

```shell
./bin/gollama-mcp-server
```
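The project does not ship an install config, but a typical stdio-based MCP client entry (Claude Desktop style) might look like the following sketch. The command path is a placeholder: point it at wherever you built the binary, using an absolute path.

```json
{
  "mcpServers": {
    "gollama-mcp-server": {
      "command": "/path/to/gollama-mcp-server/bin/gollama-mcp-server",
      "args": [],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```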
Run the server with HTTP transport on a specific port:

```shell
./bin/gollama-mcp-server -port 8080
```

You can also specify a custom host:

```shell
./bin/gollama-mcp-server -host 127.0.0.1 -port 8080
```
A pre-built Docker image is available from GitHub Container Registry:

```shell
docker pull ghcr.io/kevensen/gollama-mcp-server:latest
```

Alternatively, build the Docker image locally from the repository root:

```shell
docker build -t gollama-mcp-server .
```

Run the server in HTTP mode (port 8080) using the pre-built image:

```shell
docker run -p 8080:8080 ghcr.io/kevensen/gollama-mcp-server:latest
```

Or use your locally built image:

```shell
docker run -p 8080:8080 gollama-mcp-server
```

If your Ollama instance is running on the host machine, use host networking:

```shell
# Linux
docker run --network host ghcr.io/kevensen/gollama-mcp-server:latest

# macOS/Windows - use host.docker.internal
docker run -p 8080:8080 -e OLLAMA_HOST=http://host.docker.internal:11434 ghcr.io/kevensen/gollama-mcp-server:latest
```

To connect to a remote Ollama instance:

```shell
docker run -p 8080:8080 -e OLLAMA_HOST=http://your-ollama-host:11434 ghcr.io/kevensen/gollama-mcp-server:latest
```

Run in detached mode with automatic restart:

```shell
docker run -d --restart unless-stopped \
  -p 8080:8080 \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  --name gollama-mcp \
  ghcr.io/kevensen/gollama-mcp-server:latest
```
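The same detached setup could also be expressed as a Docker Compose service. This is a hypothetical equivalent of the `docker run` command above (the service name is arbitrary):

```yaml
# docker-compose.yml — hypothetical equivalent of the detached docker run
services:
  gollama-mcp:
    image: ghcr.io/kevensen/gollama-mcp-server:latest
    restart: unless-stopped
    ports:
      - "8080:8080"
    environment:
      OLLAMA_HOST: http://host.docker.internal:11434
```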
```json
{
  "name": "listModels"
}
```

```json
{
  "name": "generate",
  "arguments": {
    "model": "llama3.2",
    "prompt": "Write a haiku about programming",
    "temperature": 0.7,
    "stream": false
  }
}
```

```json
{
  "name": "chat",
  "arguments": {
    "model": "llama3.2",
    "messages": [
      {"role": "user", "content": "Hello! How are you?"}
    ],
    "temperature": 0.8,
    "stream": false
  }
}
```

```json
{
  "name": "pullModel",
  "arguments": {
    "model": "llama3.2",
    "stream": true
  }
}
```

```json
{
  "name": "embeddings",
  "arguments": {
    "model": "nomic-embed-text",
    "input": "The quick brown fox jumps over the lazy dog"
  }
}
```
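Under the hood, the embeddings tool wraps Ollama's embeddings endpoint. As a rough illustration of what such a call looks like in Go (assuming Ollama's `/api/embed` HTTP endpoint; this is not the server's own code), a direct request might be built like this:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// embedRequest mirrors the JSON payload Ollama's /api/embed
// endpoint expects: a model name and an input string.
type embedRequest struct {
	Model string `json:"model"`
	Input string `json:"input"`
}

// buildEmbedRequest marshals the payload for an embeddings call.
func buildEmbedRequest(model, input string) ([]byte, error) {
	return json.Marshal(embedRequest{Model: model, Input: input})
}

func main() {
	host := os.Getenv("OLLAMA_HOST")
	if host == "" {
		host = "http://localhost:11434"
	}
	body, err := buildEmbedRequest("nomic-embed-text", "hello")
	if err != nil {
		panic(err)
	}
	resp, err := http.Post(host+"/api/embed", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("request failed (is Ollama running?):", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```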
```
gollama-mcp-server/
├── cmd/
│   └── main.go          # Application entry point
├── internal/
│
```
... [View full README on GitHub](https://github.com/kevensen/gollama-mcp-server#readme)