Example of integrating MCP with OpenAI chat completions to use tools from an MCP server.
```json
{
  "mcpServers": {
    "mcp-chatwithtools": {
      "command": "<see-readme>",
      "args": []
    }
  }
}
```

No install config available. Check the server's README for setup instructions.
Is it safe?
- No package registry to scan.
- No authentication: any process on your machine can connect.
- No known vulnerabilities.
- License: MIT.

Is it maintained?
- Last commit 0 days ago.

Will it work with my client?
- Transport: stdio. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
- No automated test is available for this server; check the GitHub README for setup instructions.
A learning project demonstrating how to integrate the Model Context Protocol (MCP) into chat applications. This repository shows practical patterns for connecting OpenAI's LLM with MCP servers to enable dynamic tool discovery and execution.
This project demonstrates how to expose MCP server tools to an application through OpenAI-compatible chat completion calls, which accept an array of tool definitions.
The architecture shows how to cleanly separate concerns when integrating MCP into LLM applications, making it easy to add new tools without modifying application code.
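Because tool wiring lives in configuration rather than application code, adding a capability is a config change. As an illustration, a hypothetical `mcp.json` entry for an additional server might look like the following (the server name and command here are invented for illustration and are not from this repository):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```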
Prerequisites: uv package manager (recommended) or pip.

```shell
# Install dependencies
uv sync
```
The main example demonstrating the "tools with chat" pattern:

```shell
# Set your OpenAI API key (also supports .env)
export OPENAI_API_KEY="your-api-key-here"

# Optionally set an alternative API base URL
export OPENAI_BASE_URL="your-alt-url/v1 here"

# Run with the default model (gpt-4o-mini)
python chatwithtools.py mcp.json

# Or specify a different model
python chatwithtools.py mcp.json gpt-4o
```

chatwithtools.py demonstrates how to integrate MCP (Model Context Protocol) into a "tools with chat" application. It shows a clean architectural pattern with two key components:

- a get_tools() function that formats MCP tools for OpenAI's function calling API
- an MCPToolExecutor class that loads MCP tools and executes the model's tool calls

This pattern enables an OpenAI LLM to dynamically discover and call tools hosted on any MCP server without hardcoding tool definitions. The system consists of four main participants.
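The formatting step can be sketched as a small conversion from MCP tool listings to OpenAI's function-calling shape. This is an illustrative sketch, not the repository's actual get_tools() implementation; the function and variable names below are invented, and it assumes MCP's JSON Schema `inputSchema` can be passed through as OpenAI's `parameters`:

```python
def mcp_tools_to_openai(mcp_tools: list) -> list:
    """Format MCP tool listings for OpenAI's `tools` parameter (hypothetical helper)."""
    return [
        {
            "type": "function",
            "function": {
                "name": t["name"],
                "description": t.get("description", ""),
                # MCP publishes a JSON Schema `inputSchema`, which matches
                # the shape OpenAI expects for `parameters`.
                "parameters": t.get(
                    "inputSchema", {"type": "object", "properties": {}}
                ),
            },
        }
        for t in mcp_tools
    ]


# Example: one MCP tool listing, converted for the chat completions API
mcp_tools = [{
    "name": "get_weather",
    "description": "Look up current weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

openai_tools = mcp_tools_to_openai(mcp_tools)
print(openai_tools[0]["function"]["name"])
```

The resulting list can be passed directly as the `tools` argument of a chat completion request.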
The ChatSession class implements the "tools with chat" pattern, orchestrating the complete conversation flow with OpenAI:
Responsibilities:
Responsibilities:

- Calls MCPToolExecutor.initialize_tools() to get MCP tools formatted for OpenAI
- Handles the model's tool_calls array

Key Methods:
- initialize() - Loads MCP tools via tool_executor.initialize_tools() for the tools array
- send_message(user_message) - Orchestrates the full chat completion cycle, including tool calls
- run() - Interactive command-line loop

Key Pattern:
```python
# Phase 1: Chat with tools array
response = openai.chat.completions.create(
    model=self.model,
    messages=self.messages,
    tools=self.tools,  # Formatted by MCPToolExecutor
    tool_choice="auto",
)

# Phase 2: Execute tools if requested
message = response.choices[0].message
if message.tool_calls:
    ...
```

[View full README on GitHub](https://github.com/oshea00/mcp-chatwithtools#readme)
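To make the two-phase round trip concrete, here is a runnable sketch of the flow, with a stub standing in for the OpenAI client so it runs offline. Everything here (function names, the stub, the example tool) is illustrative, not the repository's actual code; in real use the stub would be replaced by an `openai.OpenAI()` client and `execute_tool` would dispatch to the MCP server:

```python
import json
from types import SimpleNamespace


def chat_round(client, model, messages, tools, execute_tool):
    """One round of the two-phase pattern: chat, then run requested tools."""
    # Phase 1: the model sees the MCP-derived tools array.
    response = client.chat.completions.create(
        model=model, messages=messages, tools=tools, tool_choice="auto")
    msg = response.choices[0].message
    if not msg.tool_calls:
        return msg.content

    # The assistant turn carrying the tool calls must be appended first.
    messages.append({
        "role": "assistant",
        "content": msg.content,
        "tool_calls": [{
            "id": c.id,
            "type": "function",
            "function": {"name": c.function.name,
                         "arguments": c.function.arguments},
        } for c in msg.tool_calls],
    })

    # Phase 2: execute each requested tool and feed results back.
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        result = execute_tool(call.function.name, args)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": str(result),
        })
    return None  # caller loops until no tool_calls remain


# Minimal offline stub; swap in openai.OpenAI() for real use.
class StubClient:
    class chat:
        class completions:
            @staticmethod
            def create(**kwargs):
                call = SimpleNamespace(
                    id="call_1",
                    function=SimpleNamespace(
                        name="add", arguments='{"a": 2, "b": 3}'))
                msg = SimpleNamespace(content=None, tool_calls=[call])
                return SimpleNamespace(choices=[SimpleNamespace(message=msg)])


messages = [{"role": "user", "content": "What is 2 + 3?"}]
chat_round(StubClient(), "gpt-4o-mini", messages, tools=[],
           execute_tool=lambda name, args: args["a"] + args["b"])
print(messages[-1])  # the "tool" message appended in phase 2
```

After phase 2, the caller sends the augmented `messages` list back to the model so it can produce a final answer that incorporates the tool results.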