Bridge Claude with OpenAI Codex CLI for AI-to-AI collaboration, code review, and second opinions
```json
{
  "mcpServers": {
    "io-github-lykhoyda-ask-codex": {
      "command": "<see-readme>",
      "args": []
    }
  }
}
```

No install config available. Check the server's README for setup instructions.
Is it safe?
No package registry to scan.
No authentication — any process on your machine can connect.
License not specified.
Is it maintained?
Commit history unknown.
Will it work with my client?
Transport: stdio. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
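The stdio transport means a client launches the server as a child process and exchanges newline-delimited JSON-RPC messages over its stdin/stdout. A minimal sketch of that pattern, using a stand-in echo child process rather than the real ask-codex server (the child's behavior here is purely illustrative):

```python
import json
import subprocess
import sys

# Stand-in "server": reads one JSON-RPC request per line from stdin
# and echoes the method name back in the result. A real MCP server
# implements initialize, tools/list, tools/call, etc.
CHILD = r"""
import json, sys
for line in sys.stdin:
    req = json.loads(line)
    resp = {"jsonrpc": "2.0", "id": req["id"], "result": {"echo": req["method"]}}
    sys.stdout.write(json.dumps(resp) + "\n")
    sys.stdout.flush()
"""

def call(method: str) -> dict:
    """Spawn the child, send one request over stdin, read one response."""
    proc = subprocess.Popen(
        [sys.executable, "-c", CHILD],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    req = {"jsonrpc": "2.0", "id": 1, "method": method}
    proc.stdin.write(json.dumps(req) + "\n")
    proc.stdin.flush()
    resp = json.loads(proc.stdout.readline())
    proc.stdin.close()
    proc.wait()
    return resp

if __name__ == "__main__":
    print(call("tools/list"))
```

This is also why "any process on your machine can connect": whoever can spawn the binary gets a session, with no authentication step in between.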
| Package | Type | Version | Downloads |
|---|---|---|---|
| ask-gemini-mcp | MCP Server | | |
| ask-codex-mcp | MCP Server | | |
| ask-ollama-mcp | MCP Server | | |
| ask-llm-mcp | MCP Server | | |
| @ask-llm/plugin | Claude Code Plugin | /plugin install | |
MCP servers + Claude Code plugin for AI-to-AI collaboration
MCP servers that bridge your AI client with multiple LLM providers for AI-to-AI collaboration. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's 1M+ token context, Codex's GPT-5.4, or local Ollama models — all via standard MCP.
```sh
# All-in-one — auto-detects installed providers
claude mcp add --scope user ask-llm -- npx -y ask-llm-mcp

claude mcp add --scope user gemini -- npx -y ask-gemini-mcp
claude mcp add --scope user codex -- npx -y ask-codex-mcp
claude mcp add --scope user ollama -- npx -y ask-ollama-mcp
```
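The all-in-one server "auto-detects installed providers". One plausible way to do that (a sketch only — the CLI names and the PATH-probing approach are assumptions, not confirmed against the ask-llm-mcp source):

```python
import shutil

# Assumed provider CLI binary names; the real package may probe
# differently (e.g. config files or API keys).
PROVIDER_CLIS = {"gemini": "gemini", "codex": "codex", "ollama": "ollama"}

def detect_providers(clis: dict) -> list:
    """Return the provider names whose CLI binary is found on PATH."""
    return [name for name, cli in clis.items() if shutil.which(cli)]

if __name__ == "__main__":
    print(detect_providers(PROVIDER_CLIS))
```

Providers whose CLI is absent would simply not be offered as tools, so a missing `codex` binary degrades gracefully instead of failing at startup.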
Add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "ask-llm": {
      "command": "npx",
      "args": ["-y", "ask-llm-mcp"]
    }
  }
}
```
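If you edit `claude_desktop_config.json` by hand, it is easy to clobber servers you already have. A small helper that merges one entry into the existing file instead of overwriting it (the function name and merge policy are illustrative, not part of any of these packages):

```python
import json
from pathlib import Path

def add_mcp_server(config_path: Path, name: str, command: str, args: list) -> dict:
    """Merge one server entry into an mcpServers config file.

    Creates the file if missing; preserves existing entries.
    """
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers[name] = {"command": command, "args": args}
    config_path.write_text(json.dumps(config, indent=2))
    return config

if __name__ == "__main__":
    import tempfile
    path = Path(tempfile.mkdtemp()) / "claude_desktop_config.json"
    add_mcp_server(path, "ask-llm", "npx", ["-y", "ask-llm-mcp"])
    print(path.read_text())
```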
Or configure providers individually:

```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    },
    "codex": {
      "command": "npx",
      "args": ["-y", "ask-codex-mcp"]
    },
    "ollama": {
      "command": "npx",
      "args": ["-y", "ask-ollama-mcp"]
    }
  }
}
```
Cursor (`.cursor/mcp.json`):
{
"mcpServers": {
"ask-llm"
... [View full README on GitHub](https://github.com/Lykhoyda/ask-llm#readme)
No automated test available for this server. Check the GitHub README for setup instructions.
No known vulnerabilities.