A tiny Model Context Protocol server with a few useful tools, deployed on Smithery, tested in Claude Desktop, and indexed in NANDA.
Tools included
- hello(name) – quick greeting
- randomNumber(max?) – random integer (default 100)
- weather(city) – current weather via wttr.in
- summarize(text, maxSentences?, model?) – OpenAI-powered summary (requires OPENAI_API_KEY)

Public server page
https://smithery.ai/server/@FelixYifeiWang/felix-mcp-smithery
MCP endpoint (streamable HTTP)
https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp
(In Smithery/NANDA, auth is attached via query param api_key and optional profile, configured in the platform UI; do not hardcode secrets here.)
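As a sketch of the query-string auth described above (the api_key and profile parameter names come from the note; the key and profile values below are placeholders, not real credentials), the hosted endpoint URL can be assembled like this:

```javascript
// Build the hosted MCP endpoint URL with query-string auth.
// Parameter names (api_key, profile) per the note above; the
// values are placeholders — never hardcode real secrets.
function buildEndpointUrl(apiKey, profileId) {
  const url = new URL(
    "https://server.smithery.ai/@FelixYifeiWang/felix-mcp-smithery/mcp"
  );
  url.searchParams.set("api_key", apiKey);
  if (profileId) url.searchParams.set("profile", profileId);
  return url.toString();
}

console.log(buildEndpointUrl("YOUR_SMITHERY_API_KEY", "YOUR_PROFILE_ID"));
```

In practice the platform UI attaches these for you; building the URL by hand is only needed for raw HTTP clients.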
Open Settings → Developer → mcpServers and add:
{
  "mcpServers": {
    "felix-mcp-smithery": {
      "command": "npx",
      "args": [
        "-y",
        "@smithery/cli@latest",
        "run",
        "@FelixYifeiWang/felix-mcp-smithery",
        "--key",
        "YOUR_SMITHERY_API_KEY",
        "--profile",
        "YOUR_PROFILE_ID"
      ]
    }
  }
}
Start a new chat and run:
{ "name": "Felix" }

Implementation notes:
- Transport: StreamableHTTPServerTransport on /mcp (POST/GET/DELETE).
- Sessions are tracked via the Mcp-Session-Id header (no close recursion).
- summarize defaults to the gpt-4o-mini model.
- Requires Node 18+ (tested on Node 20).
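The shape of the hello tool's handler can be sketched as a plain function returning MCP-style text content — the greeting wording here is illustrative, and the real server registers the handler through McpServer in buildServer():

```javascript
// Sketch of the hello tool handler: takes { name } and returns an
// MCP-style result with a single text content item. Standalone for
// illustration — in the real server this is registered on McpServer.
function helloHandler({ name }) {
  return {
    content: [{ type: "text", text: `Hello, ${name}!` }],
  };
}

console.log(helloHandler({ name: "Felix" }).content[0].text);
```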
git clone https://github.com/FelixYifeiWang/felix-mcp-smithery
cd felix-mcp-smithery
npm install
Set env (only needed if you’ll call summarize locally):
export OPENAI_API_KEY="sk-..."
Run:
node index.js
# ✅ MCP Streamable HTTP server on 0.0.0.0:8081 (POST/GET/DELETE /mcp)
Local curl:
curl -s -X POST "http://localhost:8081/mcp" \
-H 'Content-Type: application/json' \
-H 'Mcp-Protocol-Version: 2025-06-18' \
--data '{"jsonrpc":"2.0","id":0,"method":"initialize","params":{"protocolVersion":"2025-06-18"}}'
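After initialize, subsequent requests should echo back the session id the server returned. A small helper (hypothetical, for illustration) that builds the headers used in the curl call above, carrying Mcp-Session-Id forward:

```javascript
// Build headers for MCP requests: content type, protocol version,
// and — after initialize — the Mcp-Session-Id the server returned,
// so follow-up calls reuse the same session.
function mcpHeaders(sessionId) {
  const headers = {
    "Content-Type": "application/json",
    "Mcp-Protocol-Version": "2025-06-18",
  };
  if (sessionId) headers["Mcp-Session-Id"] = sessionId;
  return headers;
}

console.log(mcpHeaders("example-session-id"));
```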
hello
{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"hello","arguments":{"name":"Felix"}}}
randomNumber
{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"randomNumber","arguments":{"max":10}}}
weather
{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"weather","arguments":{"city":"Boston"}}}
summarize (needs OPENAI_API_KEY set on the server)
{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"summarize","arguments":{"text":"(paste long text)","maxSentences":2}}}
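The tools/call payloads above all share one shape; a small helper (hypothetical, for illustration) that builds them:

```javascript
// Build a JSON-RPC 2.0 tools/call payload like the examples above.
function toolCall(id, name, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

console.log(JSON.stringify(toolCall(1, "hello", { name: "Felix" })));
```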
Server core:
McpServer from @modelcontextprotocol/sdk with tools registered in buildServer().
Transport: StreamableHTTPServerTransport on /mcp handling:
- POST /mcp — JSON-RPC requests (and first-time initialize)
- GET /mcp — server-to-client notifications (SSE)
- DELETE /mcp — end session

CORS: allows all origins; exposes the Mcp-Session-Id header (good for hosted clients).
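The three verbs map onto three transport responsibilities; a minimal sketch of that dispatch (a plain function for illustration, not the actual Express wiring):

```javascript
// Map an HTTP method on /mcp to the transport responsibility it
// serves, mirroring the routing described above. Returns null for
// unsupported methods (the real server would answer 405).
function routeMcp(method) {
  switch (method) {
    case "POST":   return "json-rpc request";   // incl. first-time initialize
    case "GET":    return "sse notifications";  // server-to-client stream
    case "DELETE": return "end session";
    default:       return null;
  }
}

console.log(routeMcp("POST"));
```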
OpenAI summarize: Thin fetch wrapper around /v1/chat/completions with a short “crisp summarizer” system prompt.
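A sketch of the request body such a wrapper would send to /v1/chat/completions — the exact prompt wording is an assumption (the real one lives in index.js), and gpt-4o-mini is the default model per the notes above:

```javascript
// Build the Chat Completions request body a summarize wrapper could
// send. System prompt wording is assumed; model default per the
// implementation notes.
function buildSummarizeBody(text, maxSentences = 3, model = "gpt-4o-mini") {
  return {
    model,
    messages: [
      {
        role: "system",
        content: `You are a crisp summarizer. Reply in at most ${maxSentences} sentences.`,
      },
      { role: "user", content: text },
    ],
  };
}

console.log(buildSummarizeBody("(paste long text)", 2).model);
```

The wrapper would POST this body with an Authorization: Bearer header built from OPENAI_API_KEY.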
GitHub repo with:
- index.js (Express + MCP)
- package.json (@modelcontextprotocol/sdk, express, cors, zod)
- Dockerfile
- smithery.yaml