Server data from the Official MCP Registry
Pseudonymizes PII before your LLM and returns a cryptographically signed receipt per response.
This is a well-structured privacy-focused MCP server with proper authentication, secure credential handling, and comprehensive certificate verification logic. The codebase demonstrates strong security practices with minimal concerns. Permissions are appropriate for the server's purpose (API gateway middleware), and the implementation includes robust error handling and input validation. Supply chain analysis found 6 known vulnerabilities in dependencies (1 critical, 2 high severity). Package verification found 1 issue.
5 files analyzed · 10 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Set these up before or after installing:
Environment variable: LUCAIRN_API_KEY
Environment variable: ANTHROPIC_API_KEY
Environment variable: OPENAI_API_KEY
Environment variable: LUCAIRN_BASE_URL
Environment variable: LUCAIRN_TRANSPORT
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-declade-lucairn-mcp-server": {
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key-here",
        "LUCAIRN_API_KEY": "your-lucairn-api-key-here",
        "LUCAIRN_BASE_URL": "your-lucairn-base-url-here",
        "ANTHROPIC_API_KEY": "your-anthropic-api-key-here",
        "LUCAIRN_TRANSPORT": "your-lucairn-transport-here"
      },
      "args": [
        "-y",
        "@lucairn/mcp-server"
      ],
      "command": "npx"
    }
  }
}
From the project's GitHub README.
Official client libraries for Lucairn — an EU-based privacy-preserving AI gateway. Lucairn sits between your application (or AI agent) and the upstream LLM (Claude, GPT-4o, o1/o3/o4) and removes personal data from prompts before the model ever sees them. Every response carries a cryptographically signed compliance certificate proving what was redacted, when, and by which sanitizer layer.
This monorepo hosts four packages at parity:
- @lucairn/mcp-server — Model Context Protocol server (one-line npx install for Claude Desktop, Cursor, Cline, Continue, …)
- @lucairn/sdk — TypeScript / Node SDK
- lucairn — Python SDK
- github.com/declade/lucairn-sdks/go — Go SDK

For most agent use cases, the fastest path is the MCP server. No build step, no install — npx runs it on demand:
npx -y @lucairn/mcp-server
Add it to your MCP client config (Claude Desktop's claude_desktop_config.json, Cursor's mcp.json, Cline's cline_mcp_settings.json, Continue, etc.):
{
  "mcpServers": {
    "lucairn": {
      "command": "npx",
      "args": ["-y", "@lucairn/mcp-server"],
      "env": {
        "LUCAIRN_API_KEY": "<your_lucairn_api_key>",
        "ANTHROPIC_API_KEY": "<optional_byok_anthropic_key>",
        "OPENAI_API_KEY": "<optional_byok_openai_key>"
      }
    }
  }
}
Restart your client. The chat_via_lucairn tool becomes available immediately. See mcp-server/README.md for full details.
Each request through any Lucairn SDK follows the same pipeline:
Detected PII is replaced with stable placeholders ([PERSON_1], [EMAIL_2], [IBAN_3], …) before the request reaches the upstream LLM. For Lucairn-hosted Developer-tier callers, this on-gateway pseudonymization happens before your LLM sees the request. Enterprise self-host deployments can run the entire stack inside the customer environment, in which case no raw identity data leaves that environment at all.
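To make the placeholder scheme concrete, here is a minimal sketch of the idea in Python. It handles only one PII class (emails) with a naive regex; the real gateway detects many classes with dedicated sanitizer layers, so `pseudonymize` and its regex are purely illustrative assumptions, not the SDK's API.

```python
import re
from itertools import count

def pseudonymize(text: str) -> tuple[str, dict[str, str]]:
    """Replace each email address with a stable [EMAIL_n] placeholder.

    Illustrative only: Lucairn's sanitizer layers cover persons, IBANs,
    and other classes with far more robust detection than one regex.
    """
    mapping: dict[str, str] = {}
    counter = count(1)

    def repl(m: re.Match) -> str:
        original = m.group(0)
        # The same value always maps to the same placeholder,
        # so references stay consistent across the prompt.
        if original not in mapping:
            mapping[original] = f"[EMAIL_{next(counter)}]"
        return mapping[original]

    return re.sub(r"[\w.+-]+@[\w-]+\.\w+", repl, text), mapping

redacted, mapping = pseudonymize("Contact alice@example.com or alice@example.com.")
```

The mapping stays on the redaction side, which is what makes response re-linking (a Pro feature per the README) possible without ever sending raw identifiers upstream.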
The gateway picks the upstream provider from the model parameter you send:
| Model prefix | Upstream provider | BYOK env var |
|---|---|---|
| claude-*, anthropic-* | Anthropic | ANTHROPIC_API_KEY |
| gpt-*, openai-*, o1-*, o3-*, o4-* | OpenAI | OPENAI_API_KEY |
Cross-provider BYOK shipped in @lucairn/mcp-server@1.1.0 — set one or both keys in the same MCP config and the server forwards the matching one as X-Upstream-Key per request, so your provider account is billed directly.
| Component | Package | Version | README |
|---|---|---|---|
| MCP server | @lucairn/mcp-server | 1.2.0 | mcp-server/README.md |
| TypeScript | @lucairn/sdk | 1.0.0 | ts/README.md |
| Python | lucairn | 1.0.0 | python/README.md |
| Go | github.com/declade/lucairn-sdks/go | v0.1.0 | go/README.md |
All SDKs are at parity at the observable level. Cross-language byte-equivalence is locked via shared Go-assembler-generated fixtures, so a certificate signed via one SDK verifies identically via the other two.
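Byte-equivalence across languages hinges on every SDK serializing the certificate payload to the exact same bytes before signing. A generic sketch of such a canonical encoding (sorted keys, no whitespace, UTF-8) is shown below; the function name and the choice of encoding are assumptions for illustration, not Lucairn's actual wire format.

```python
import hashlib
import json

def canonical_bytes(payload: dict) -> bytes:
    # Deterministic encoding: sorted keys, no whitespace, UTF-8.
    # Any SDK that emits these exact bytes hashes and signs identically.
    return json.dumps(
        payload, sort_keys=True, separators=(",", ":"), ensure_ascii=False
    ).encode("utf-8")

# Input key order does not affect the output bytes:
a = canonical_bytes({"b": 1, "a": 2})
b = canonical_bytes({"a": 2, "b": 1})
digest = hashlib.sha256(a).hexdigest()
```

Shared fixtures then pin these bytes in the repo, so a regression in any one SDK's serializer breaks its tests rather than silently producing certificates the other SDKs reject.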
Sign up at https://lucairn.eu/account/signup. Free Developer tier: 500 requests/month, no credit card required.
Pro adds response re-linking, programmatic certificate JSON access, audit-event export, and higher quota. Enterprise adds self-host, BYOK with provider-side billing isolation, and the optional custom-trained PII shield (priced per scope).
See https://lucairn.eu/pricing for the full tier comparison.
Every response through any SDK gets a signed Lucairn certificate. Two surfaces:
- Quick check: getCertificateSummary (TS) / get_certificate_summary (Python) / GetCertificateSummary (Go), or paste the certificate URL into https://lucairn.eu/verify.
- Full verification: getCertificate + verifyCertificate (and language equivalents). The verifier is in-tree — see ts/src/verify-certificate/, python/src/lucairn/verify_certificate/, and the internal/verify package under go/.

External RFC 3161 + Sigstore Rekor anchor verification is currently surfaced as pass-through metadata; full external anchor verification lands in a follow-up release.
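The shape of in-tree verification can be illustrated with a toy scheme: sign a canonical encoding of the certificate body, then recompute and compare on the verifying side. This sketch uses HMAC-SHA256 purely for self-containedness — Lucairn's real certificates use asymmetric signatures with RFC 3161 / Sigstore anchors, and every name here is hypothetical.

```python
import hashlib
import hmac
import json

def sign_certificate(cert: dict, key: bytes) -> str:
    """Sign a deterministic encoding of the certificate body (toy scheme)."""
    body = json.dumps(cert, sort_keys=True, separators=(",", ":")).encode()
    return hmac.new(key, body, hashlib.sha256).hexdigest()

def verify_certificate(cert: dict, signature: str, key: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_certificate(cert, key), signature)

cert = {
    "issued_at": "2025-01-01T00:00:00Z",
    "redactions": [{"type": "EMAIL", "placeholder": "[EMAIL_1]"}],
}
sig = sign_certificate(cert, b"demo-key")
```

The essential property is the same as with the real verifier: any tampering with the recorded redactions invalidates the signature, which is what lets a certificate prove what was redacted and when.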
Pre-1.0 monorepo, individual packages tagged per the table above. Cross-language byte-equivalence locked via shared fixtures. Follow CHANGELOG.md for release notes.
- @lucairn/mcp-server: https://www.npmjs.com/package/@lucairn/mcp-server
- @lucairn/sdk: https://www.npmjs.com/package/@lucairn/sdk
- lucairn: https://pypi.org/project/lucairn/

See CONTRIBUTING.md. Security reports: SECURITY.md.
MIT — see LICENSE.