Server data from the Official MCP Registry
Compression AI automation via MCP. Includes estimate ratio, suggest algorithm, calculate sav...
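The "estimate ratio" tool named in the description is not shown in this listing, but the general idea can be sketched with Python's standard zlib module. The function name and signature below are hypothetical illustrations, not the server's actual tool:

```python
import zlib

def estimate_ratio(data: bytes, level: int = 6) -> float:
    """Rough compression-ratio estimate: original size / deflated size.

    Hypothetical helper for illustration only; the real server's tool
    interface and algorithm are not documented in this listing.
    """
    if not data:
        return 1.0  # nothing to compress
    compressed = zlib.compress(data, level)
    return len(data) / len(compressed)

# Highly repetitive input compresses very well, so the ratio is large.
sample = b"abc" * 1000
ratio = estimate_ratio(sample)
```

Real-world ratios depend heavily on input entropy; already-compressed data (JPEG, ZIP) typically yields a ratio near or below 1.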
This MCP server has critical security issues that prevent safe use. The server imports an external authentication middleware from a hardcoded local path (~/clawd/meok-labs-engine/shared) that is not included in the repository, creating an opaque dependency and making the auth implementation unverifiable. Additionally, all API key parameters are passed in plaintext through tool arguments with no HTTPS enforcement, rate limiting relies on in-memory state (not persistent), and there is no input validation on data sizes despite claims of 1MB limits. The permissions are appropriate for a compression utility, but the authentication and dependency issues are severe. Supply chain analysis found 3 known vulnerabilities in dependencies (0 critical, 3 high severity). Package verification found 1 issue.
6 files analyzed · 14 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-csoai-org-compression-ai-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "compression-ai-mcp"
      ]
    }
  }
}

From the project's GitHub README:
Installation · Docs · Report Bug
pip install compression-ai-mcp
# or
npm install -g @meok-ai/compression-ai-mcp
See the project repository for full documentation and examples.
MIT © CSOAI