An MCP server that provides tools for exploring large OpenAPI schemas
{
  "mcpServers": {
    "openapi-mcp-proxy": {
      "command": "<see-readme>",
      "args": []
    }
  }
}

No install config available. Check the server's README for setup instructions.
Is it safe?
No package registry to scan.
No authentication — any process on your machine can connect.
MIT. View license →
Is it maintained?
Last commit 185 days ago. 14 stars.
Will it work with my client?
Transport: stdio. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
No automated test available for this server. Check the GitHub README for setup instructions.
No known vulnerabilities.
An MCP server that provides tools for exploring large OpenAPI schemas without loading entire schemas into LLM context. Perfect for discovering and analyzing endpoints, data models, and API structure efficiently.

macOS/Linux:

```shell
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Using pip:

```shell
pip install uv
```

Clone the repository and install dependencies:

```shell
git clone https://github.com/nyudenkov/openapi-mcp-proxy.git
cd openapi-mcp-proxy
uv sync

# Test that the server starts correctly
uv run python main.py
```

The server should start without errors. To run it:

```shell
uv run python main.py
```

The server runs over stdio and integrates with MCP-compatible LLM clients.
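Since the server runs over stdio, it can be wired into an MCP client's config file. A plausible filled-in version of the install config shown at the top of this page, assuming `uv` is on your PATH and the repository is cloned locally (the `/path/to/openapi-mcp-proxy` path is illustrative):

```json
{
  "mcpServers": {
    "openapi-mcp-proxy": {
      "command": "uv",
      "args": ["--directory", "/path/to/openapi-mcp-proxy", "run", "python", "main.py"]
    }
  }
}
```

`uv --directory` points `uv` at the cloned repo so the client can launch the server from any working directory.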
- add_api: Add a new API configuration with name, URL and optional description
  - name (required): Short name for the API
  - url (required): URL to the OpenAPI schema (YAML/JSON)
  - description (optional): Optional description
  - headers (optional): Optional HTTP headers for authentication (e.g., {'Authorization': 'Bearer token', 'X-API-Key': 'key'})
- list_saved_apis: List all saved API configurations
- remove_api: Remove a saved API configuration
- get_api_info: Get general information about an API
- list_endpoints: List all endpoints in an API with pagination and filtering
- search_endpoints: Search endpoints by query with pagination and filtering
- get_endpoint_details: Get detailed information about a specific endpoint
- list_models: List all data models in an API with pagination and filtering
- get_model_schema: Get detailed schema for a specific model
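For a client speaking the MCP JSON-RPC protocol, a `tools/call` request for the add_api tool above might look like the following sketch. The tool name and argument keys come from the list above; the URL, description, and header values are purely illustrative:

```python
import json

# Hypothetical MCP "tools/call" request for the add_api tool.
# Argument keys match the documented parameters; values are examples only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add_api",
        "arguments": {
            "name": "petstore",
            "url": "https://petstore3.swagger.io/api/v3/openapi.json",
            "description": "Swagger Petstore demo API",
            "headers": {"Authorization": "Bearer <token>"},
        },
    },
}
print(json.dumps(request, indent=2))
```

Your MCP client normally builds this envelope for you; the part you supply is the `arguments` object.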
All listing tools (list_endpoints, search_endpoints, list_models) support pagination to handle large APIs efficiently:
Tools can also filter results to find exactly what you need:
Endpoint Filtering:
Model Filtering:
API configurations are automatically saved to api_configs.json in the working directory. The file structure:
{
  "apis": {
    "api-name": {
      "name": "some-project-local-backend",
      "url": "http://127.0.0.1:8000/openapi.json",
      "description": "Optional description for some cool project local
... [View full README on GitHub](https://github.com/nyudenkov/openapi-mcp-proxy#readme)