This MCP server provides tools for listing and retrieving content from different knowledge bases.
Is it safe?
No package registry to scan.
No authentication — any process on your machine can connect.
Unlicense. View license →
Is it maintained?
Last commit 108 days ago. 43 stars.
Will it work with my client?
Transport: stdio. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
No automated test available for this server. Check the GitHub README for setup instructions.
No known vulnerabilities.
These instructions assume you have Node.js and npm installed on your system.
To install Knowledge Base Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @jeanibarz/knowledge-base-mcp-server --client claude
Prerequisites
Clone the repository:
git clone <repository_url>
cd knowledge-base-mcp-server
Install dependencies:
npm install
Configure environment variables:
This server supports three embedding providers: Ollama (recommended for reliability), OpenAI, and HuggingFace (the default fallback).
To use local Ollama embeddings, first pull the embedding model:
ollama pull dengcao/Qwen3-Embedding-0.6B:Q8_0
Then set:
EMBEDDING_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434 # Default Ollama URL
OLLAMA_MODEL=dengcao/Qwen3-Embedding-0.6B:Q8_0 # Default embedding model
KNOWLEDGE_BASES_ROOT_DIR=$HOME/knowledge_bases
To use the OpenAI API for embeddings, set:
EMBEDDING_PROVIDER=openai
OPENAI_API_KEY=your_api_key_here
OPENAI_MODEL_NAME=text-embedding-ada-002
KNOWLEDGE_BASES_ROOT_DIR=$HOME/knowledge_bases
To use HuggingFace (the default), set EMBEDDING_PROVIDER=huggingface or leave it unset:
EMBEDDING_PROVIDER=huggingface # Optional, this is the default
HUGGINGFACE_API_KEY=your_api_key_here
HUGGINGFACE_MODEL_NAME=sentence-transformers/all-MiniLM-L6-v2
KNOWLEDGE_BASES_ROOT_DIR=$HOME/knowledge_bases
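The provider fallback described above can be sketched as follows. This is a hypothetical illustration, not the server's actual source; the function name and return shape are assumptions, while the environment variable names and default values come from the configuration blocks above.

```javascript
// Hypothetical sketch of embedding-provider selection; not taken
// from the server's source. Defaults mirror the README's examples.
function resolveEmbeddingProvider(env) {
  // HuggingFace is used when EMBEDDING_PROVIDER is unset.
  const provider = (env.EMBEDDING_PROVIDER || "huggingface").toLowerCase();
  switch (provider) {
    case "ollama":
      return {
        provider: "ollama",
        baseUrl: env.OLLAMA_BASE_URL || "http://localhost:11434",
        model: env.OLLAMA_MODEL || "dengcao/Qwen3-Embedding-0.6B:Q8_0",
      };
    case "openai":
      return {
        provider: "openai",
        apiKey: env.OPENAI_API_KEY,
        model: env.OPENAI_MODEL_NAME || "text-embedding-ada-002",
      };
    default:
      return {
        provider: "huggingface",
        apiKey: env.HUGGINGFACE_API_KEY,
        model: env.HUGGINGFACE_MODEL_NAME || "sentence-transformers/all-MiniLM-L6-v2",
      };
  }
}
```

For example, with no environment variables set, the sketch falls back to HuggingFace with the all-MiniLM-L6-v2 model.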
Optionally, set the FAISS_INDEX_PATH environment variable to specify the path to the FAISS index. If not set, it defaults to $HOME/knowledge_bases/.faiss. To write logs to a file, set LOG_FILE=/path/to/logs/knowledge-base.log. Log verbosity defaults to info and can be adjusted with LOG_LEVEL=debug|info|warn|error. These variables can be set in your .bashrc or .zshrc file, or directly in the MCP settings.
Build the server:
npm run build
Add the server to the MCP settings:
Edit the cline_mcp_settings.json file (for example, at /home/jean/.vscode-server/data/User/globalStorage/saoudrizwan.claude-dev/settings/; the exact path depends on your VS Code installation).
Add the following configuration to the mcpServers object:
Option 1: Ollama Configuration
"knowledge-base-mcp-ollama": {
"command": "node",
"args": [
"/path/to/knowledge-base-mcp-server/build/index.js"
],
"disabled": false,
"autoApprove": [],
"env": {
"KNOWLEDGE_BASES_ROO
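For reference, a complete Ollama entry built from the environment variables listed earlier might look like the following. The directory path is a placeholder; substitute your own (environment variables such as $HOME are not expanded inside JSON, so use an absolute path).

```json
"knowledge-base-mcp-ollama": {
  "command": "node",
  "args": ["/path/to/knowledge-base-mcp-server/build/index.js"],
  "disabled": false,
  "autoApprove": [],
  "env": {
    "KNOWLEDGE_BASES_ROOT_DIR": "/home/user/knowledge_bases",
    "EMBEDDING_PROVIDER": "ollama",
    "OLLAMA_BASE_URL": "http://localhost:11434",
    "OLLAMA_MODEL": "dengcao/Qwen3-Embedding-0.6B:Q8_0"
  }
}
```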