Crawl and analyse websites for SEO errors using Crawlee with SQLite storage

To install, add the following to your MCP client configuration (for Claude Desktop, this is claude_desktop_config.json):
{
  "mcpServers": {
    "seo-crawler-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@houtini/seo-crawler-mcp"
      ],
      "env": {
        "OUTPUT_DIR": "C:\\seo-audits"
      }
    }
  }
}
Is it safe?
No known CVEs for @houtini/seo-crawler-mcp.
No authentication — any process on your machine can connect.
License not specified.
Is it maintained?
Last commit 15 days ago. 11 stars. 10 weekly downloads.
Will it work with my client?
Transport: stdio, sse, http. Works with Claude Desktop, Cursor, Claude Code, and most MCP clients.
Context cost
4 tools. ~500 tokens (0.3% of 200K).
Run this in your terminal to verify the server starts:
npx -y '@houtini/seo-crawler-mcp' 2>&1 | head -1 && echo "✓ Server started successfully"
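The one-liner above assumes a POSIX shell; on Windows (where the example config writes to C:\seo-audits), head is not available by default. Below is a minimal cross-platform sketch in Node.js that starts the server the same way and reports its first line of output. It assumes only that the server logs something on startup, which stdio MCP servers do not always do, hence the timeout.

import { spawn } from "node:child_process";

// Start the server the same way the config above does.
// npx is a .cmd shim on Windows, so run it through a shell there.
const proc = spawn("npx", ["-y", "@houtini/seo-crawler-mcp"], {
  shell: process.platform === "win32",
});

let reported = false;
const onFirstOutput = (chunk: Buffer) => {
  if (reported) return;
  reported = true;
  console.log(chunk.toString().split("\n")[0]);
  console.log("✓ Server started successfully");
  proc.kill();
};

// Startup messages may land on stdout or stderr.
proc.stdout.once("data", onFirstOutput);
proc.stderr.once("data", onFirstOutput);

proc.on("error", (err) => {
  console.error("Failed to start server:", err.message);
});

// stdio servers can be silent until they receive a request,
// so stop waiting after ten seconds rather than hanging.
setTimeout(() => {
  if (!reported) {
    console.error("No startup output within 10s; the server may still be working.");
    proc.kill();
  }
}, 10_000);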
Tools
run_seo_audit: Run a comprehensive SEO audit crawl on a website. Crawls the site, analyzes pages for SEO issues, and stores results in a local SQLite database.
analyze_seo: Analyze SEO crawl results and generate a summary report with statistics and findings.
query_seo_data: Execute specialized SQL queries against crawled SEO data to analyze specific issues such as broken links, missing meta tags, and security vulnerabilities.
list_seo_queries: List all available SEO analysis queries with their descriptions, categories, and priority levels.
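Outside a desktop client, these tools can be called over stdio with the official MCP TypeScript SDK (@modelcontextprotocol/sdk). The sketch below assumes the SDK's standard client API; the argument shape for run_seo_audit (a url field) is a guess, so inspect the schemas returned by listTools() before relying on it.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server exactly as the JSON config above does.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@houtini/seo-crawler-mcp"],
  env: { ...(process.env as Record<string, string>), OUTPUT_DIR: "C:\\seo-audits" },
});

const client = new Client({ name: "seo-audit-example", version: "1.0.0" });
await client.connect(transport);

// List the four tools and their real input schemas.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Hypothetical arguments; the actual input schema may differ.
const audit = await client.callTool({
  name: "run_seo_audit",
  arguments: { url: "https://example.com" },
});
console.log(audit.content);

await client.close();

Once run_seo_audit has populated the SQLite database, analyze_seo and query_seo_data can be called the same way to summarise and drill into the results.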