Server data from the Official MCP Registry
AI Agent-Native Data Platform — ingest, validate, transform, and query data.
This MCP server for the Datris data platform has a reasonable architecture but exhibits several moderate security concerns. The server accepts per-session API keys via context variables and makes downstream HTTP calls to a Datris backend, but lacks strict input validation on user-controlled parameters passed to shell operations and external services. The activity-monitoring system logs API-key hints and user arguments without sufficient truncation controls, and error handling in HTTP operations is inconsistent. Permissions are appropriate for the stated purpose (network access, file I/O for data ingestion), but input sanitization and logging practices need hardening. Supply-chain analysis found 13 known vulnerabilities in dependencies (0 critical, 6 high severity). Package verification found 1 issue.
4 files analyzed · 23 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
Set these up before or after installing:
Environment variable: PIPELINE_URL
Environment variable: PIPELINE_API_KEY
Add this to your MCP configuration file:
```json
{
  "mcpServers": {
    "io-github-datris-datris": {
      "env": {
        "PIPELINE_URL": "your-pipeline-url-here",
        "PIPELINE_API_KEY": "your-pipeline-api-key-here"
      },
      "args": [
        "datris-mcp-server"
      ],
      "command": "uvx"
    }
  }
}
```

From the project's GitHub README.
datris.ai · Documentation · MCP Registry · PyPI
Ingest, validate, transform, store, and retrieve your data — whether you're an AI agent talking through MCP or a developer writing config. One platform for both.
git clone https://github.com/datris/datris-platform-oss.git
cd datris-platform-oss
cp .env.example .env # Add your ANTHROPIC_API_KEY and/or OPENAI_API_KEY
docker compose up -d
UI: http://localhost:4200 · API: http://localhost:8080
Add to your MCP client config (Claude Desktop, Claude Code, Cursor, etc.). With the Docker stack running, the npx mcp-remote stdio bridge connects to the bundled MCP server on port 3000 — your client appears in the Datris UI Agent Monitor tab with live tool-call streaming:
```json
{
  "mcpServers": {
    "datris": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:3000/sse", "--transport", "sse-only"]
    }
  }
}
```
Paste-and-go for the default local setup — no API key required when USE_API_KEYS=false (the OSS default). If your instance enables auth (USE_API_KEYS=true or hosted/multi-tenant), append "--header", "x-api-key:<your-key>" to the args array. The Configuration → Connect Your Agent page generates the snippet for you and adds the header automatically when you paste your key.
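For reference, an authenticated connection would look roughly like this (the key value is a placeholder; the Connect Your Agent page generates the exact snippet for your instance):

```json
{
  "mcpServers": {
    "datris": {
      "command": "npx",
      "args": [
        "-y", "mcp-remote", "http://localhost:3000/sse",
        "--transport", "sse-only",
        "--header", "x-api-key:YOUR_KEY_HERE"
      ]
    }
  }
}
```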
Requires Node.js on your PATH (brew install node). For a stdio alternative without Docker, or full Claude Desktop / Claude Code / Cursor walkthroughs, see Configuring Claude.
brew tap datris/tap
brew install datris
datris ingest data.csv --dest postgres
datris ingest sales.csv --ai-validate "prices > 0" --ai-transform "convert dates to YYYY/MM/DD"
datris query "SELECT * FROM sales"
datris search "quarterly revenue" --store pgvector
datris tap create "Fetch S&P 500 daily prices from yfinance" --pipeline stocks
datris taps
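Under the hood, `--ai-validate` and `--ai-transform` have the model generate a script from the plain-English rule. A rough sketch of what a generated script might do for `prices > 0` and the date conversion above — the function names and row format here are illustrative, not Datris internals:

```python
from datetime import datetime

def validate_prices(rows):
    """Keep rows satisfying the plain-English rule "prices > 0"."""
    valid, rejected = [], []
    for row in rows:
        (valid if float(row["price"]) > 0 else rejected).append(row)
    return valid, rejected

def transform_dates(rows, field="date"):
    """Rewrite dates like 03/14/2024 into YYYY/MM/DD."""
    for row in rows:
        parsed = datetime.strptime(row[field], "%m/%d/%Y")
        row[field] = parsed.strftime("%Y/%m/%d")
    return rows

rows = [
    {"price": "19.99", "date": "03/14/2024"},
    {"price": "-5.00", "date": "12/01/2024"},
]
valid, rejected = validate_prices(rows)
print(len(valid), len(rejected))          # 1 1
print(transform_dates(valid)[0]["date"])  # 2024/03/14
```

Rejected rows would then surface in the job status that `AI Error Explanation` summarizes.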
Source (File Upload / MinIO Event / Database Pull / Kafka)
→ Preprocessor (optional REST endpoint)
→ Data Quality (AI rules, header validation, schema validation)
→ Transformation (AI transformation, destination schema)
→ Destinations (in parallel):
PostgreSQL, MongoDB, MinIO (Parquet/ORC), Kafka, ActiveMQ,
REST Endpoint, Qdrant, Weaviate, Milvus, Chroma, pgvector
→ Notifications (ActiveMQ topic)
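The stage ordering above can be sketched as composable functions with a fan-out at the end. The stage and sink names mirror the diagram, but the signatures are illustrative (and Datris writes destinations in parallel; this sketch is sequential for simplicity):

```python
def run_pipeline(records, stages, destinations):
    """Run records through each stage in order, then write to every destination."""
    for stage in stages:
        records = stage(records)
    for write in destinations:  # parallel in Datris; sequential here
        write(records)
    return records

# Illustrative stand-ins for the Data Quality and Transformation stages
quality = lambda rs: [r for r in rs if r.get("value") is not None]
transform = lambda rs: [{**r, "value": float(r["value"])} for r in rs]

sink_a, sink_b = [], []  # stand-ins for PostgreSQL, MinIO, etc.
out = run_pipeline(
    [{"value": "1.5"}, {"value": None}],
    stages=[quality, transform],
    destinations=[sink_a.extend, sink_b.extend],
)
print(out)  # [{'value': 1.5}]
```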
| Feature | Description |
|---|---|
| MCP Server | 47 tools for AI agents — pipeline CRUD, upload, query, search, profiling, taps |
| AI Data Quality | Plain English validation rules — AI generates and runs a validation script |
| AI Transformation | Plain English transformations — AI generates and runs a transformation script |
| AI Schema Generation | Upload a file, get a complete pipeline config |
| AI Data Profiling | Upload a file, get statistics + suggested validation rules |
| AI Error Explanation | Job failures explained in plain English |
| Natural Language Query | Ask questions in English, get SQL results |
| RAG Pipeline | Chunk, embed, and search across 5 vector databases |
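The RAG pipeline's first step, chunking, can be pictured as a sliding window over the text before embedding. The chunk size and overlap below are illustrative defaults, not Datris's actual configuration:

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows for embedding."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

doc = "quarterly revenue " * 40  # 720 characters
chunks = chunk_text(doc)
print(len(chunks), len(chunks[0]))  # 5 200
```

Each chunk would then be embedded (via TEI or OpenAI, per the model table below) and written to the configured vector store.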
CSV, JSON, XML, Excel, PDF, Word (DOCX), plain text
Anthropic Claude (Sonnet 4.6 default, Opus 4.7 for CodeGen) · OpenAI (GPT-5.5) · Ollama (local models, optional). Embeddings via TEI sidecar (BAAI/bge-m3) when using Anthropic, or text-embedding-3-small when using OpenAI.
| Service | Purpose |
|---|---|
| MinIO | S3-compatible object store for file staging and data output |
| PostgreSQL | Default structured destination, also hosts pgvector for RAG |
| MongoDB | Configuration store, job status tracking, metadata |
| ActiveMQ | File notification queue, pipeline event notifications |
| HashiCorp Vault | Secrets management (database credentials, API keys) |
| TEI | Text Embeddings Inference sidecar (BAAI/bge-m3) for vector embeddings without an OpenAI key |
| Apache Kafka | Optional streaming source and destination |
| Apache Spark | Local Spark for writing Parquet/ORC to MinIO |
Full documentation at docs.datris.ai or locally at docs/.