A rudimentary implementation of Anthropic's Model Context Protocol with OpenAI's Model
{
  "mcpServers": {
    "openai-mcp-client": {
      "command": "<see-readme>",
      "args": []
    }
  }
}
Check the server's README for setup instructions.
This is a minimal example of how to use the Model Context Protocol (MCP) with OpenAI's API to create an agent that acts from a chat context. Feel free to use this as a starting point for your own projects.
Is it safe?
No package registry to scan.
No authentication — any process on your machine can connect.
License not specified.
Is it maintained?
Last commit 478 days ago. 54 stars.
Will it work with my client?
Transport: stdio. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
No automated test available for this server. Check the GitHub README for setup instructions.
No known vulnerabilities.
deno install to install dependencies
Copy .env.example to .env and fill in the values
deno run dev to start the application

Chat messages are appended, and currently the entire conversation is always sent to the server. This can rack up a lot of tokens and cost a lot of money, depending on the length of the conversation, the model you are using, and the size of the context.
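One common way to bound that cost is to send only a window of recent messages instead of the full history. The helper below is a hypothetical sketch (it is not part of this repo; the names and message shape are assumptions based on the OpenAI chat format):

```typescript
// Hypothetical helper: cap the number of messages sent to the API,
// always preserving a leading system prompt if one exists.
interface ChatMessage {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
}

function trimHistory(messages: ChatMessage[], maxMessages: number): ChatMessage[] {
  if (messages.length <= maxMessages) return messages;
  const [first, ...rest] = messages;
  if (first.role === "system") {
    // Keep the system prompt plus the most recent (maxMessages - 1) turns.
    return [first, ...rest.slice(rest.length - (maxMessages - 1))];
  }
  // No system prompt: just keep the most recent turns.
  return messages.slice(messages.length - maxMessages);
}
```

A simple message cap like this loses older context; a smarter variant could summarize dropped turns, but the idea is the same.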
This implementation currently only supports tool call responses of type text. Other resource types can be implemented in applyToolCallIfExists in src/openai-utils.ts.
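Support for other result types would mean converting each non-text content part into something the chat model can read. The branch structure below is a sketch of that idea, not the repo's actual code; only the text case mirrors the existing behavior, and the image/resource handling is an assumption:

```typescript
// Hypothetical: render an MCP tool result part as a string for the chat.
type ToolContent =
  | { type: "text"; text: string }
  | { type: "image"; data: string; mimeType: string }
  | { type: "resource"; resource: { uri: string } };

function renderToolContent(part: ToolContent): string {
  switch (part.type) {
    case "text":
      return part.text; // currently the only supported type
    case "image":
      // Sketch: describe the image rather than inlining base64 data.
      return `[image: ${part.mimeType}, ${part.data.length} base64 chars]`;
    case "resource":
      // Sketch: reference the resource by URI.
      return `[resource: ${part.resource.uri}]`;
  }
}
```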
Your latest messages are saved in messages.json for debugging purposes. These messages are overwritten every time you run the application, so make a copy of the file before running it again if you want to keep the previous messages.
If you want to run the application in debug mode, set the DEBUG environment variable to true in your .env file. This will print out more information about the messages and tool calls.
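Parsing that flag is simple but easy to get subtly wrong (env values are strings, not booleans). A small helper along these lines could do it; the accepted values here are an assumption, not the repo's actual logic:

```typescript
// Hypothetical helper: interpret the DEBUG variable from .env.
// Treats "true" (any case) or "1" as enabled; anything else, including
// an unset variable, as disabled.
function isDebugEnabled(value: string | undefined): boolean {
  return value?.toLowerCase() === "true" || value === "1";
}
```

In a Deno app this would typically be called as isDebugEnabled(Deno.env.get("DEBUG")).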