Unified MCP server for querying OpenTelemetry traces across multiple backends (Jaeger, Tempo, Traceloop, etc.), enabling AI agents to analyze distributed traces for automated debugging and observability.
{
"mcpServers": {
"opentelemetry-mcp": {
"env": {
"BACKEND_URL": "http://localhost:16686",
"BACKEND_TYPE": "jaeger"
},
"args": [
"opentelemetry-mcp"
],
"command": "uvx"
}
}
}
Is it safe?
No known CVEs for opentelemetry-mcp.
No authentication — any process on your machine can connect.
Apache-2.0. View license →
Is it maintained?
Last commit 5 days ago. 181 stars.
Will it work with my client?
Transport: stdio, sse, http. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
Run this in your terminal to verify the server starts.
uvx 'opentelemetry-mcp' 2>&1 | head -1 && echo "✓ Server started successfully"
Query and analyze LLM traces with AI assistance. Ask Claude to find expensive API calls, debug errors, compare model performance, or track token usage—all from your IDE.
An MCP (Model Context Protocol) server that connects AI assistants to OpenTelemetry trace backends (Jaeger, Tempo, Traceloop), with specialized support for LLM observability through OpenLLMetry semantic conventions.
See it in action:
https://github.com/user-attachments/assets/e2106ef9-0a58-4ba0-8b2b-e114c0b8b4b9
No installation required! Configure your client to run the server directly from PyPI:
// Add to claude_desktop_config.json:
{
"mcpServers": {
"opentelemetry-mcp": {
"command": "pipx",
"args": ["run", "opentelemetry-mcp"],
"env": {
"BACKEND_TYPE": "jaeger",
"BACKEND_URL": "http://localhost:16686"
}
}
}
}
Or use uvx (alternative):
{
"mcpServers": {
"opentelemetry-mcp": {
"command": "uvx",
"args": ["opentelemetry-mcp"],
"env": {
"BACKEND_TYPE": "jaeger",
"BACKEND_URL": "http://localhost:16686"
}
}
}
}
That's it! Ask Claude: "Show me traces with errors from the last hour"
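Under the hood, a prompt like that maps onto the backend's query API. As a rough sketch (not the server's actual code) of the Jaeger HTTP query such a request could translate to, with a hypothetical `checkout-service` as the service name:

```python
import json
import time
from urllib.parse import urlencode

def error_trace_query(base_url: str, service: str,
                      lookback_s: int = 3600, limit: int = 20) -> str:
    """Build a Jaeger /api/traces URL for traces tagged error=true.

    Parameter names follow Jaeger's query API; timestamps are
    in microseconds since the epoch.
    """
    end_us = int(time.time() * 1_000_000)
    params = {
        "service": service,
        "tags": json.dumps({"error": "true"}),   # tag filter is a JSON string
        "start": end_us - lookback_s * 1_000_000,
        "end": end_us,
        "limit": limit,
    }
    return f"{base_url}/api/traces?{urlencode(params)}"

# Hypothetical service name, Jaeger running on the default query port:
url = error_trace_query("http://localhost:16686", "checkout-service")
```

The MCP server exposes this kind of lookup as a tool, so the agent never has to construct backend URLs itself.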
# Run without installing (recommended)
pipx run opentelemetry-mcp --backend jaeger --url http://localhost:16686
# Or with uvx
uvx opentelemetry-mcp --backend jaeger --url http://localhost:16686
This approach runs the latest published version straight from PyPI, with no local install to manage.
Configure the MCP server in your Claude Desktop config file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Using pipx (recommended):
{
"mcpServers": {
"opentelemetry-mcp": {
"command": "pipx",
"args": ["run", "opentelemetry-mcp"],
"env": {
"BACKEND_TYPE": "jaeger",
"BACKEND_URL": "http://localhost:16686"
}
}
}
}
Using uvx (alternative):
{
"mcpServers": {
"opentelemetry-mcp": {
"command": "uvx",
"args": ["opentelemetry-mcp"],
"env": {
"BACKEND_TYPE": "jaeger",
"BACKEND_URL": "http://localhost:16686"
}
}
}
}
For Traceloop backend:
{
"mcpServers": {
"opentelemetry-mcp": {
"command": "pipx",
"args": ["run", "opentelemetry-mcp"],
"env": {
"BACKEND_TYPE": "traceloop",
"BACKEND_URL": "https://api.traceloop.com",
"BACKEND_API_KEY": "your_traceloop_api_key_here"
}
}
}
}
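Replace the placeholder with your real Traceloop API key. As an illustration of how a backend client might consume `BACKEND_API_KEY` (the variable used in the config above; bearer-token auth is an assumption here, not confirmed from the server's source):

```python
import os

def auth_headers(env=None) -> dict:
    """Build request headers for a key-protected backend such as Traceloop.

    Reads BACKEND_API_KEY so the key never has to be hard-coded in
    source files; the Bearer scheme is an assumption for illustration.
    """
    env = os.environ if env is None else env
    key = env.get("BACKEND_API_KEY")
    if not key:
        raise RuntimeError("Set BACKEND_API_KEY before starting the server")
    return {"Authorization": f"Bearer {key}"}
```

Failing fast on a missing key gives a clear error at startup instead of an opaque 401 from the backend later.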
If you're developing locally with the cloned repository, use one of these configurations:
Option 1: Wrapper script (easy backend switching)
{
"mcpServers": {
"opentelemetry-mcp": {
"command": "/absolute/path/to/opentelemetry-mcp-server/start_locally.sh"
}
}
}
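The contents of `start_locally.sh` are not shown here, but a wrapper like this typically just picks a backend, exports its settings, and launches the server from the checkout. A sketch of the equivalent logic (the defaults and the `uv run` invocation are illustrative, not taken from the repository's script):

```python
import os
import subprocess

def server_env(backend: str = "jaeger",
               url: str = "http://localhost:16686") -> dict:
    """Copy the current environment and add the backend settings
    the server reads (BACKEND_TYPE / BACKEND_URL)."""
    env = dict(os.environ)
    env["BACKEND_TYPE"] = backend
    env["BACKEND_URL"] = url
    return env

def start_locally(repo_dir: str) -> subprocess.Popen:
    # `uv run` executes the entry point against the project's own
    # locked dependencies inside the cloned repository.
    return subprocess.Popen(["uv", "run", "opentelemetry-mcp"],
                            cwd=repo_dir, env=server_env())
```

Switching backends then means changing one argument (e.g. `server_env("tempo", "http://localhost:3200")`) rather than editing the client config.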
Option 2: UV directly (for multiple backends)
{
"mcpServers": {
"opentelemetry-mcp-jaeger": {
"command": "uv",
... [View full README on GitHub](https://github.com/traceloop/opentelemetry-mcp-server#readme)