Give your AI assistant access to real Helm chart data. No more hallucinated values.yaml files.
When you ask Claude, Cursor, or other AI assistants to help with Kubernetes deployments, they don't have access to Helm chart schemas. So they guess — and the guesses look plausible but don't match reality.
mcp-helm implements the Model Context Protocol (MCP) — a standard way for AI assistants to access external data sources.
Add this to your editor's MCP config to use our public instance (rate limited, no install required):
{
  "mcpServers": {
    "helm": {
      "type": "http",
      "url": "https://helm-mcp.kubedoll.com/mcp"
    }
  }
}
Then ask your AI: "What values can I configure for the bitnami/postgresql chart?"
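Under the hood, the assistant translates a question like that into an MCP tool call. A rough sketch of the JSON-RPC request it would send (the tool name comes from the tools table below, but the exact argument names here are assumptions, not taken from the server's schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_values",
    "arguments": {
      "repository": "bitnami",
      "chart": "postgresql"
    }
  }
}
```

The server responds with the chart's real values.yaml content, which the assistant uses to ground its answer instead of guessing.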
Edit ~/.claude/mcp.json:
{
  "mcpServers": {
    "helm": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "ghcr.io/kubedoll-heavy-industries/mcp-helm", "--transport=stdio"]
    }
  }
}
Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
{
  "mcpServers": {
    "helm": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "ghcr.io/kubedoll-heavy-industries/mcp-helm", "--transport=stdio"]
    }
  }
}
Edit ~/.cursor/mcp.json (or .cursor/mcp.json inside a project for project-scoped config):
{
  "mcpServers": {
    "helm": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "ghcr.io/kubedoll-heavy-industries/mcp-helm", "--transport=stdio"]
    }
  }
}
Add to your Continue config (~/.continue/config.json):
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "docker",
          "args": ["run", "--rm", "-i", "ghcr.io/kubedoll-heavy-industries/mcp-helm", "--transport=stdio"]
        }
      }
    ]
  }
}
If you prefer to run the binary directly, install mcp-helm and replace the Docker config with:
{
  "mcpServers": {
    "helm": {
      "command": "mcp-helm"
    }
  }
}
| Tool | What it does |
|---|---|
| `search_charts` | Search for charts in a Helm repository |
| `get_versions` | Get available versions of a chart (newest first; use `limit=1` for the latest) |
| `get_values` | Get a chart's values.yaml, with optional JSON schema (`include_schema=true`) |
| `get_dependencies` | Get chart dependencies from C |