Validate and test llguidance grammars with batch testing and documentation
Is it safe?
No known CVEs for guidance-lark-mcp.
No authentication — any process on your machine can connect.
License not specified.
Is it maintained?
Last commit 34 days ago.
Will it work with my client?
Transport: stdio, http. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
Run this in your terminal to verify the server starts:
uvx 'guidance-lark-mcp' 2>&1 | head -1 && echo "✓ Server started successfully"
MCP server for validating and testing llguidance grammars (Lark format). Provides grammar validation, batch test execution, and syntax documentation — ideal for iteratively building grammars with AI coding assistants.
Run directly with uvx:
uvx guidance-lark-mcp
Or install from PyPI:
pip install guidance-lark-mcp
Or install from a clone of the repository:
cd mcp-grammar-tools
pip install -e .
You can add the server using the interactive /mcp add command or by editing the config file directly. See the Copilot CLI MCP documentation for full details.
Option 1: Interactive setup
In the Copilot CLI, run /mcp add, select Local/STDIO, and enter uvx guidance-lark-mcp as the command.
Option 2: Edit config file
Add the following to ~/.copilot/mcp-config.json:
{
"mcpServers": {
"grammar-tools": {
"type": "local",
"command": "uvx",
"args": ["guidance-lark-mcp"],
"tools": ["*"]
}
}
}
This gives you grammar validation and batch testing out of the box. To also enable LLM-powered generation (generate_with_grammar), add ENABLE_GENERATION and your credentials to env:
"env": {
"ENABLE_GENERATION": "true",
"OPENAI_API_KEY": "your-key-here"
}
For Azure OpenAI (with Entra ID via az login), use guidance-lark-mcp[azure] and set the endpoint instead:
"args": ["guidance-lark-mcp[azure]"],
"env": {
"ENABLE_GENERATION": "true",
"AZURE_OPENAI_ENDPOINT": "https://your-resource.openai.azure.com/",
"OPENAI_MODEL": "your-deployment-name"
}
See Backend Configuration for all supported backends.
After saving, use /mcp show to verify the server is connected.
Complete Copilot CLI configuration with generation enabled:
{
"mcpServers": {
"grammar-tools": {
"type": "local",
"command": "uvx",
"args": ["guidance-lark-mcp"],
"env": {
"ENABLE_GENERATION": "true",
"OPENAI_API_KEY": "your-key-here"
},
"tools": ["*"]
}
}
}
For clients that use the plain mcpServers format (without the Copilot-specific type and tools fields):
{
"mcpServers": {
"grammar-tools": {
"command": "uvx",
"args": ["guidance-lark-mcp"],
"env": {
"ENABLE_GENERATION": "true",
"OPENAI_API_KEY": "your-key-here"
}
}
}
}
validate_grammar — Validate grammar completeness and consistency using llguidance's built-in validator.
{"grammar": "start: \"hello\" \"world\""}
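Grammars are not limited to a single rule. As an illustrative example (mine, not from the project docs), a multi-rule Lark grammar with an inline regex terminal would be validated the same way:

```lark
// Hypothetical example grammar, in the same Lark dialect as the
// document's other snippets (inline regexes act as anonymous terminals).
start: greeting " " name
greeting: "hello" | "hi"
name: /[A-Za-z]+/
```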
run_batch_validation_tests — Run batch validation tests from a JSON file against a grammar. Returns pass/fail statistics and detailed failure info.
{
"grammar": "start: /[0-9]+/",
"test_file": "tests.json"
}
Test file format:
[
{"input": "123", "should_parse": true, "description": "Valid number"},
{"input": "abc", "should_parse": false, "description": "Not a number"}
]
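Before pointing the tool at a test file, the documented shape can be sanity-checked locally. `check_test_file` below is a hypothetical helper of mine, not part of the server:

```python
import json

def check_test_file(path):
    """Check that a batch-test file matches the documented shape:
    a JSON array of objects with "input" (string) and "should_parse"
    (boolean), plus an optional "description"."""
    with open(path) as f:
        cases = json.load(f)
    if not isinstance(cases, list):
        raise ValueError("test file must be a JSON array")
    for i, case in enumerate(cases):
        if not isinstance(case.get("input"), str):
            raise ValueError(f"case {i}: 'input' must be a string")
        if not isinstance(case.get("should_parse"), bool):
            raise ValueError(f"case {i}: 'should_parse' must be a boolean")
    return len(cases)
```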
get_llguidance_documentation — Fetch the llguidance grammar syntax documentation from the official repo.
generate_with_grammar (optional, requires ENABLE_GENERATION=true) — Generate text using an OpenAI model constrained by a grammar. Uses the Responses API with custom tool grammar format, so output is guaranteed to conform to the grammar. Requires OPENAI_API_KEY environment variable. See Backend Configuration for Azure and other endpoints.
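The request this maps onto can be sketched as follows. The payload shape (a "custom" tool whose format has "type": "grammar" and "syntax": "lark") is my reading of the OpenAI Responses API custom-tool grammar format, and build_request is a hypothetical helper, not this server's code:

```python
def build_request(prompt, grammar, model):
    """Assemble a Responses API request whose custom tool constrains
    the model's output to the given Lark-format grammar. Payload shape
    is an assumption based on OpenAI's custom-tool grammar format."""
    return {
        "model": model,
        "input": prompt,
        "tools": [{
            "type": "custom",
            "name": "constrained_output",
            "description": "Emit text matching the grammar.",
            "format": {
                "type": "grammar",
                "syntax": "lark",
                "definition": grammar,
            },
        }],
    }

req = build_request("Give me a number.", r"start: /[0-9]+/", "your-model")
# The server would then send this via the OpenAI SDK, roughly:
#   client.responses.create(**req)
```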
The generate_with_grammar tool uses the OpenAI Python SDK, which natively supports multiple backends via environment variables:
| Backend | Required env vars | Optional env vars |
|---------|-------------------|-------------------|
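As a sketch for the default OpenAI backend (OPENAI_BASE_URL is the OpenAI Python SDK's standard endpoint override; OPENAI_MODEL is the server's own variable shown in the Azure example above):

```shell
# Default OpenAI backend: only an API key is required.
export ENABLE_GENERATION=true
export OPENAI_API_KEY=your-key-here

# OpenAI-compatible endpoint (e.g. a local server): point the SDK at it.
export OPENAI_BASE_URL=http://localhost:8000/v1
export OPENAI_MODEL=your-model-name
```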