Wrap any CLI as a Model Context Protocol server - schema auto-inferred from --help.
Status: v0.1 — early release. Stdio transport only. APIs may change before 1.0.
Expose any command-line binary as a Model Context Protocol tool by parsing its --help output and synthesizing a JSON Schema at startup. One command, no boilerplate.
Works with any MCP-compatible client — Claude Desktop, ChatGPT (via OpenAI Agents SDK), Cursor, Gemini CLI, Cline, Windsurf, Continue, Zed, and anything else that speaks the MCP stdio transport.
```bash
npx cli2mcp <command>
```
Writing an MCP server for a CLI you already have is mechanical work: instantiate the SDK, register a tool, hand-write the input schema, marshal arguments, spawn the subprocess, format the output. Roughly 80–150 lines of TypeScript per binary, repeated forever as new tools come out.
cli2mcp does it in one command. The CLI's own --help is the source of truth for the schema — if rg adds a flag tomorrow, the AI sees it tomorrow without code changes.
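As an illustration, a flag discovered from `rg --help` might surface in the generated tool definition roughly like this. This is a hypothetical sketch of the shape an MCP client would see — the property names and descriptions cli2mcp actually emits may differ:

```json
{
  "name": "ripgrep",
  "description": "Recursively search files with regex",
  "inputSchema": {
    "type": "object",
    "properties": {
      "pattern": {
        "type": "string",
        "description": "Regex pattern to search for"
      },
      "ignore_case": {
        "type": "boolean",
        "description": "-i, --ignore-case: case-insensitive search"
      }
    }
  }
}
```

Because the schema is rebuilt from `--help` at startup, it tracks the installed binary's version rather than a hand-maintained wrapper.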
```bash
npm install -g cli2mcp
# or invoke without installing
npx cli2mcp <command>
```
Requires Node.js 22+.
cli2mcp is launched by your client as a stdio subprocess. Add an entry per CLI you want to expose.
Config file location:
| OS | Path |
|---|---|
| macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
| Windows | %APPDATA%\Claude\claude_desktop_config.json |
| Linux | ~/.config/Claude/claude_desktop_config.json |
```json
{
  "mcpServers": {
    "ripgrep": {
      "command": "npx",
      "args": ["-y", "cli2mcp", "rg", "--name", "ripgrep"]
    },
    "jq": {
      "command": "npx",
      "args": ["-y", "cli2mcp", "jq"]
    }
  }
}
```
Restart Claude Desktop after editing.
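To sanity-check the server before wiring up a client, you can speak the protocol by hand: an MCP stdio server answers a JSON-RPC `initialize` request read from stdin. The message below is the standard MCP handshake (the `protocolVersion` shown is one published revision of the spec; adjust if your client negotiates a newer one):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "smoke-test", "version": "0.0.0" }
  }
}
```

Piping this as a single line into `npx -y cli2mcp jq` should yield an `initialize` response identifying the server on stdout (the stdio transport is newline-delimited JSON-RPC). If nothing comes back, check that Node.js 22+ is on your PATH.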
| Client | Config file | Format |
|---|---|---|
| ChatGPT (OpenAI Agents SDK) | MCPServerStdio parameter — see OpenAI Agents docs | command: "npx", args: ["-y", "cli2mcp", "<cli>"] |
| Cursor | .cursor/mcp.json (project) or ~/.cursor/mcp.json (global) | Same mcpServers block as above |
| Cline | VS Code → Cline → MCP Settings → cline_mcp_settings.json | Same mcpServers block |
| Windsurf | ~/.codeium/windsurf/mcp_config.json | Same mcpServers block |
| Gemini CLI | ~/.gemini/settings.json | Same mcpServers block |
| Continue | ~/.continue/config.json → experimental.modelContextProtocolServers | Same launcher |
| Zed | ~/.config/zed/settings.json → context_servers | Same launcher |
| Any stdio-capable MCP client | per the client's docs | Same launcher: npx -y cli2mcp <command> |
Refer to each client's documentation for the exact config path on your platform — they evolve and are not guaranteed to match the table above.
Drop any of these into your client's mcpServers block (paths shown above per client). Each one wraps a popular CLI as an MCP tool an AI can call directly.
```json
{
  "mcpServers": {
    "ripgrep": {
      "command": "npx",
      "args": ["-y", "cli2mcp", "rg", "--name", "ripgrep",
               "--description", "Recursively search files with regex"]
    },
    "jq": {
      "command": "npx",
      "args": ["-y", "cli2mcp", "jq"]
    }
  }
}
```

... [View full README on GitHub](https://github.com/RonieNeubauer/cli2mcp#readme)