Connect AI models like Claude & GPT with robots using MCP and ROS.
License: Apache-2.0. Transport: stdio (compatible with Claude Desktop, Cursor, Claude Code, and most MCP clients).
The ROS MCP Client is a reference implementation of a Model Context Protocol (MCP) client, designed to connect directly with ros-mcp-server.
Instead of relying on a desktop LLM client, it acts as a bridge that integrates an LLM directly, enabling natural-language interaction with any ROS or ROS2 robot.
ros-mcp-client implements the LLM side of the MCP protocol. It can connect to a ros-mcp-server instance over MCP (stdio or HTTP). In short, it lets you run an MCP-compatible client that speaks to robots via the MCP interface, which is useful for testing, local reasoning, or autonomous AI controllers.
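As a rough illustration of what "speaking MCP" means on the wire: every MCP session opens with a JSON-RPC 2.0 `initialize` request from the client. The sketch below composes such a message by hand; the protocol version string and client name/version are illustrative assumptions, not values taken from this repository.

```python
import json

def make_initialize_request(request_id: int = 1) -> str:
    """Compose the JSON-RPC 2.0 'initialize' request an MCP client
    sends first over stdio or HTTP, per the MCP specification."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Assumption: a dated MCP spec revision string.
            "protocolVersion": "2024-11-05",
            # This client advertises no optional capabilities.
            "capabilities": {},
            # Illustrative client identity, not from this repo.
            "clientInfo": {"name": "ros-mcp-client", "version": "0.1.0"},
        },
    }
    return json.dumps(msg)

print(make_initialize_request())
```

In practice you would use the official MCP Python SDK rather than hand-rolling JSON-RPC, but the frame above is what travels over the stdio pipe.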
- Implements MCP client specification — plug-and-play with the ROS MCP server.
- ROS-aware LLM interface — specialized prompts and handlers for robotics tasks.
- Supports bidirectional streaming — send commands, receive real-time topic feedback.
- LLM integration ready — use Gemini, Anthropic, or Ollama APIs as reasoning engines.
- Offline-capable — works entirely within local or LAN environments.
The MCP client is version-agnostic (ROS1 or ROS2).
Prerequisites: a running rosbridge and a ros-mcp-server instance.

Installation:

```shell
git clone https://github.com/robotmcp/ros-mcp-client.git
cd ros-mcp-client
uv sync  # or pip install -e .
```
Follow the setup guide for the Gemini Live client.

Start rosbridge on the target robot:

```shell
ros2 launch rosbridge_server rosbridge_websocket_launch.xml
```
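rosbridge exposes a JSON-over-WebSocket protocol (by default on port 9090), so once it is running, any client can subscribe to a topic by sending a rosbridge `subscribe` operation. A minimal sketch of composing that message; the topic name and message type are illustrative examples, not specific to this repo:

```python
import json

def make_subscribe_msg(topic: str, msg_type: str) -> str:
    """Build a rosbridge-protocol 'subscribe' operation as JSON,
    ready to send over the rosbridge WebSocket (default port 9090)."""
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

# Example: subscribe to a ROS2 string topic (illustrative names).
print(make_subscribe_msg("/chatter", "std_msgs/msg/String"))
```

After sending this frame, rosbridge pushes each message published on the topic back over the same WebSocket as a JSON object with `"op": "publish"`.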
```
ros-mcp-client/
├── clients/
│   ├── gemini_live/               # Full-featured Gemini client
│   │   ├── gemini_client.py       # Main client script
│   │   ├── mcp.json               # MCP server configuration
│   │   ├── setup_gemini_client.sh # Automated setup
│   │   └── README.md              # Detailed setup guide
├── config/                        # Shared configuration
├── scripts/                       # Utility scripts
├── pyproject.toml                 # Python dependencies
└── README.md                      # This file
```
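The `mcp.json` in `clients/gemini_live/` tells the client how to launch or reach the ros-mcp-server. A hypothetical example of its shape is below; the command, args, and server name are assumptions, so check the file in the repository for the actual values:

```json
{
  "mcpServers": {
    "ros-mcp-server": {
      "command": "uv",
      "args": ["run", "ros-mcp-server"]
    }
  }
}
```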
The project includes a comprehensive LLM client implementation: the Gemini Live client (`clients/gemini_live/`). Run `setup_gemini_client.sh` for automated setup:

```shell
# Try the Gemini Live client
cd clients/gemini_live
./setup_gemini_client.sh
uv run gemini_client.py
```
We welcome community PRs with new client implementations and integrations!
We love contributions of all kinds:
Ch