Server data from the Official MCP Registry
Crawl and analyse websites for SEO errors using Crawlee with SQLite storage
Set these environment variables before or after installing:
Environment variable: OUTPUT_DIR
Environment variable: DEBUG
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-houtini-ai-seo-crawler-mcp": {
      "env": {
        "DEBUG": "your-debug-here",
        "OUTPUT_DIR": "your-output-dir-here"
      },
      "args": [
        "-y",
        "@houtini/seo-crawler-mcp"
      ],
      "command": "npx"
    }
  }
}

Valid MCP server (1 strong and 1 medium validity signal). No known CVEs in dependencies. Package registry verified. Imported from the Official MCP Registry. Trust signals: trusted author houtini-ai (6/6 approved).
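If you want to sanity-check the server before wiring it into your MCP client, you can set the two environment variables in a shell and run the same command the configuration invokes. The values below (an output directory under `$HOME` and a `DEBUG` namespace filter) are illustrative assumptions, not documented defaults:

```shell
# Assumed example values -- substitute your own
export OUTPUT_DIR="$HOME/seo-crawler-output"   # where crawl results would be written
export DEBUG="seo-crawler:*"                   # debug log filter (assumed namespace)
mkdir -p "$OUTPUT_DIR"

# To verify the server starts outside your MCP client, run the same
# command the configuration uses:
#   npx -y @houtini/seo-crawler-mcp
```

Your MCP client passes the `env` block from the JSON configuration to the process automatically, so these exports are only needed when launching the server manually.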
Scanned 4 files · 1 finding
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.