A multi-step LangChain v1 sales-conversation agent that uses the Azure OpenAI Responses API and an MCP server with Postgres + pgvector for grounding in catalog data and CRM action tools.
Config is the same across clients — only the file and path differ.
```json
{
  "mcpServers": {
    "langchain-agent-python": {
      "command": "<see-readme>",
      "args": []
    }
  }
}
```
page_type: sample
languages:
A Python sample that shows how to build a multi-step sales agent with LangChain v1 and Azure OpenAI that drives sales through a 6-step funnel using the handoffs pattern. The agent grounds its responses in data stored in a Postgres database with pgvector for semantic search. The database is exposed as a Model Context Protocol (MCP) server that provides several tools the agent can use to quickly access data. Since the agent uses the Responses API, it can easily connect to MCP servers and comes with several built-in tools, such as a code interpreter and image generation. Get started now.
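Conceptually, the pgvector-backed search tools rank rows by vector distance. The sketch below illustrates in pure Python what a tool like `search_kb_articles` does server-side; the function names and data here are illustrative, not taken from the sample:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(query_vec, docs, k=2):
    """Return the ids of the k documents most similar to the query.

    Equivalent in spirit to pgvector's
    `ORDER BY embedding <=> $1 LIMIT k` (which orders by cosine
    *distance* ascending; here we sort by similarity descending).
    """
    ranked = sorted(
        docs,
        key=lambda d: cosine_similarity(query_vec, d[1]),
        reverse=True,
    )
    return [doc_id for doc_id, _ in ranked[:k]]
```

In the actual sample the embeddings come from text-embedding-3-small and the ranking happens inside Postgres, but the ordering logic is the same.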


- Models: gpt-5.4-mini powers the main agent; middleware tasks use gpt-5-nano.
- Embeddings: text-embedding-3-small vectors for case studies, KB articles, and the product catalogue.
- MCP tools: search_case_studies, search_kb_articles, get_pricing, and compare_plans, served over streamable HTTP.

The core LangChain agent and the PostgreSQL MCP server are deployed independently as two Container Apps:

The agent is the only public-facing service. The MCP server is reachable only from inside the Container Apps environment. All Azure access uses a user-assigned managed identity with RBAC to Azure OpenAI and PostgreSQL.
Each step is a small system prompt plus a filtered tool subset. The agent moves between steps by calling state-mutating tools (set_intent, advance_to_step, back_to_greet, escalate_to_ae):
The state machine lives in agent/app/middleware/steps.py; the per-step prompts are plain text in agent/app/prompts/.
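The per-step state machine can be sketched in plain Python. Everything below (the step names, the tool subsets, and the class shape) is assumed for illustration and is not taken from the repo's `steps.py`; only the four state-mutating tool names come from the text above:

```python
# Hypothetical sketch of a funnel state machine with filtered tool subsets.
from dataclasses import dataclass, field

# Assumed step names; the sample defines its own 6-step funnel.
FUNNEL_STEPS = ["greet", "discover", "recommend", "objections", "price", "close"]

# Each step exposes only a filtered subset of tools (illustrative mapping).
STEP_TOOLS = {
    "greet": {"set_intent"},
    "discover": {"search_kb_articles", "advance_to_step", "back_to_greet"},
    "recommend": {"search_case_studies", "compare_plans", "advance_to_step"},
    "price": {"get_pricing", "advance_to_step", "escalate_to_ae"},
}

@dataclass
class FunnelState:
    step: str = "greet"
    history: list = field(default_factory=list)

    def advance_to_step(self, step: str) -> None:
        """State-mutating tool: move the conversation to another step."""
        if step not in FUNNEL_STEPS:
            raise ValueError(f"unknown step: {step}")
        self.history.append(self.step)
        self.step = step

    def back_to_greet(self) -> None:
        """State-mutating tool: return to the start of the funnel."""
        self.history.append(self.step)
        self.step = "greet"

    def allowed_tools(self) -> set:
        """The filtered tool subset the agent sees at the current step."""
        return STEP_TOOLS.get(self.step, set())
```

The key design point is that the model never sees the full tool list at once; each step's prompt is paired with only the tools relevant to that step, which keeps tool selection reliable.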
azd). The fastest path is to open the repo in GitHub Codespaces, where every tool above is preinstalled.
Deploy the app to Azure:
```shell
az login
azd auth login
azd up
```
azd up provisions Azure OpenAI (wit