{
  "mcpServers": {
    "mcp-server-ollama-deep-researcher": {
      "command": "<see-readme>",
      "args": []
    }
  }
}
No install config available. Check the server's README for setup instructions.
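The registry entry above only carries a placeholder command. A working client config would typically point at the built Node entrypoint produced by the README's build step; the absolute path and the env block below are illustrative assumptions, not values published by the server:

```json
{
  "mcpServers": {
    "mcp-server-ollama-deep-researcher": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-server-ollama-deep-researcher/dist/index.js"],
      "env": {
        "TAVILY_API_KEY": "your_tavily_key"
      }
    }
  }
}
```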
Is it safe?
No package registry to scan.
No authentication — any process on your machine can connect.
License: MIT.
Is it maintained?
Last commit 48 days ago. 16 stars.
Will it work with my client?
Transport: stdio. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
No automated test available for this server. Check the GitHub README for setup instructions.
No known vulnerabilities.
This server's listing is missing a description; tools and install config are also missing.
Ollama Deep Researcher is a Desktop Extension (DXT) that enables advanced topic research using web search and LLM synthesis, powered by a local MCP server. It supports configurable research parameters, status tracking, and resource access, and is designed for seamless integration with the DXT ecosystem.
.
├── manifest.json # DXT manifest (see MANIFEST.md for spec)
├── src/
│ ├── index.ts # MCP server entrypoint (Node.js, stdio transport)
│ └── assistant/ # Python research logic
│ └── run_research.py
├── README.md # This documentation
└── ...
Clone the repository and install dependencies:
git clone <your-repo-url>
cd mcp-server-ollama-deep-researcher
npm install
Install Python dependencies for the assistant:
cd src/assistant
pip install -r requirements.txt
# or use pyproject.toml/uv if preferred
Set required environment variables for web search APIs:
- TAVILY_API_KEY
- PERPLEXITY_API_KEY
- EXA_API_KEY (get yours at https://dashboard.exa.ai/api-keys)
export TAVILY_API_KEY=your_tavily_key
export PERPLEXITY_API_KEY=your_perplexity_key
export EXA_API_KEY=your_exa_key
Build the TypeScript server (if needed):
npm run build
Run the extension locally for testing:
node dist/index.js
# Or use the DXT host to load the extension per DXT documentation
Available tools:
- research: start a research run, e.g. { "topic": "Your subject" }
- get_status: check the status of the current research run
- configure: set any of maxLoops, llmModel, searchApi
See manifest.json for the full DXT manifest, including tool schemas and resource templates. Follows DXT MANIFEST.md.
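Over stdio, each tool invocation is a JSON-RPC 2.0 `tools/call` request. A minimal sketch of building such a request in Python (the helper name is hypothetical; the method and params shape follow the MCP specification):

```python
import json

def make_tool_call(name, arguments, request_id=1):
    """Build an MCP JSON-RPC 2.0 tools/call request as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Request a research run on a topic, matching the tool list above.
msg = make_tool_call("research", {"topic": "Your subject"})
```

A client would write this message to the server's stdin and read the matching response (same `id`) from its stdout.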
Troubleshooting: logs are written to stderr for debugging. Ensure TAVILY_API_KEY, PERPLEXITY_API_KEY, or EXA_API_KEY is set in your environment, depending on which search API you're using.
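The environment check described above can be sketched as a small Python helper (the function name is illustrative, not part of the server's API):

```python
import os
import sys

# The search API keys the server recognizes, per the setup instructions.
SEARCH_KEYS = ("TAVILY_API_KEY", "PERPLEXITY_API_KEY", "EXA_API_KEY")

def check_search_keys(env=None):
    """Return the names of the search API keys that are set and non-empty."""
    if env is None:
        env = os.environ
    return [key for key in SEARCH_KEYS if env.get(key)]

if not check_search_keys():
    # Warn on stderr, where the server also writes its debug logs.
    print("warning: no search API key set", file=sys.stderr)
```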