A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server, model switching, streaming responses, tool management, human-in-the-loop, thinking mode, model params config, MCP prompts, custom system prompt and saved preferences. Built for developers working with local LLMs.
{
  "mcpServers": {
    "mcp-client-for-ollama": {
      "command": "<see-readme>",
      "args": []
    }
  }
}
No install config available. Check the server's README for setup instructions.
Is it safe?
No package registry to scan.
No authentication — any process on your machine can connect.
License: MIT.
Is it maintained?
Last commit 8 days ago. 589 stars.
Will it work with my client?
Transport: stdio. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
No automated test available for this server. Check the GitHub README for setup instructions.
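For context, a stdio server or client is launched locally by the MCP client as a subprocess, so registration is just a command plus arguments in the client's config file (for example, Claude Desktop's claude_desktop_config.json). The sketch below is illustrative only: the server name, command, and arguments are hypothetical placeholders, and the actual launch command for this project should be taken from its README.
{
  "mcpServers": {
    "example-stdio-server": {
      "command": "uvx",
      "args": ["example-mcp-server"]
    }
  }
}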
Last scanned 1h ago
No known vulnerabilities.
This server is missing a description. Tools and install config are also missing.