Server data from the Official MCP Registry
MCP for CanLII: Canadian case law and legislation metadata (federal, provincial, territorial).
Remote endpoints: streamable-http: https://canlii-mcp.vaquill.ai/mcp
This is a well-engineered MCP server for the CanLII legal database with thoughtful authentication design. The codebase implements proper rate limiting, supports bring-your-own-key (BYOK) authentication to avoid key exfiltration, and has no malicious patterns or dangerous code execution vulnerabilities. Minor code quality issues around error handling and input validation do not materially impact security. Supply chain analysis found 7 known vulnerabilities in dependencies (0 critical, 1 high severity).
5 files analyzed · 12 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Available as Local & Remote
This plugin can run on your machine or connect to a hosted endpoint.
From the project's GitHub README.
An MCP (Model Context Protocol) server for the CanLII Canadian legal information API. Gives AI assistants access to Canadian case law and legislation metadata across all federal, provincial, and territorial jurisdictions.
Forked from tomilashy/canlii-mcp. This fork adds bring-your-own-key (BYOK) auth, a `/health` route, and a hosted endpoint at `canlii-mcp.vaquill.ai`. Tools are unchanged.
Note: The CanLII API provides metadata only — titles, citations, dates, keywords, and citation relationships. Full document text is not available through the API.
https://canlii-mcp.vaquill.ai/mcp
Two headers are required for the hosted instance:
- `Authorization: Bearer <MCP_AUTH_TOKEN>` — gates access to the MCP server itself
- `X-CanLII-Token: <your_canlii_api_key>` — your CanLII key. Apply at canlii.org/en/api/. The server never stores your key.

{
"mcpServers": {
"canlii": {
"url": "https://canlii-mcp.vaquill.ai/mcp",
"headers": {
"Authorization": "Bearer YOUR_MCP_TOKEN",
"X-CanLII-Token": "YOUR_CANLII_API_KEY"
}
}
}
}
The same pattern works with any client that supports MCP streamable HTTP with custom headers. For stdio-only clients, use mcp-remote as a proxy.
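For example, a stdio-only client could proxy through mcp-remote roughly like this (a sketch, not an official config; mcp-remote's `--header` flag and the placeholder token values are assumptions to verify against your client's docs):

```json
{
  "mcpServers": {
    "canlii": {
      "command": "npx",
      "args": [
        "-y", "mcp-remote",
        "https://canlii-mcp.vaquill.ai/mcp",
        "--header", "Authorization: Bearer YOUR_MCP_TOKEN",
        "--header", "X-CanLII-Token: YOUR_CANLII_API_KEY"
      ]
    }
  }
}
```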
| Mode | Header | When |
|---|---|---|
| BYOK (preferred) | X-CanLII-Token: <key> | Hosted / shared deployments |
| Server fallback | (env CANLII_API) | Self-hosted single-tenant. Required for stdio. |
| MCP gate | Authorization: Bearer <MCP_AUTH_TOKEN> | Optional. Restricts who may use the hosted endpoint. |
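The key-selection logic implied by the table above can be sketched as follows (the function and its signature are illustrative, not taken from the server's source):

```typescript
// Sketch: resolve which CanLII key to use for a request.
// The per-request X-CanLII-Token header (BYOK) takes priority;
// otherwise fall back to the server-wide CANLII_API env var.
function resolveCanliiKey(
  headers: Record<string, string | undefined>,
  env: { CANLII_API?: string }
): string | undefined {
  return headers["x-canlii-token"] ?? env.CANLII_API;
}

console.log(resolveCanliiKey({ "x-canlii-token": "user-key" }, { CANLII_API: "server-key" })); // "user-key"
console.log(resolveCanliiKey({}, { CANLII_API: "server-key" })); // "server-key"
console.log(resolveCanliiKey({}, {})); // undefined
```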
| Tool | Description |
|---|---|
| list_case_databases | List all courts and tribunals in the CanLII collection |
| list_cases | Browse decisions from a specific court/tribunal database |
| get_case | Get metadata for a specific case (title, citation, date, keywords) |
| get_case_citations | Get cases cited by a case, cases citing it, or legislation it references |
| list_legislation_databases | List all statute and regulation databases |
| list_legislation | Browse statutes or regulations from a specific database |
| get_legislation | Get metadata for a specific piece of legislation |
{
"mcpServers": {
"canlii": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@tomilashy/canlii-mcp"],
"env": {
"CANLII_API": "your_api_key"
}
}
}
}
npm install
npm run build
node dist/index.js
Add to your MCP config:
{
"mcpServers": {
"canlii": {
"command": "node",
"args": ["/path/to/canlii-mcp/dist/index.js"],
"env": {
"CANLII_API": "your_api_key"
}
}
}
}
PORT=3000 CANLII_API=your_api_key node dist/index.js --transport http
The MCP endpoint is available at http://localhost:3000/mcp. The server runs in stateless mode — each request is self-contained, no session ID or initialize handshake required. Clients can call tools directly:
curl -X POST http://localhost:3000/mcp \
-H "Content-Type: application/json" \
-H "Accept: application/json, text/event-stream" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"list_case_databases","arguments":{"language":"en"}}}'
docker run -e CANLII_API=your_api_key -e MCP_AUTH_TOKEN=your_secret -p 3000:3000 ghcr.io/tomilashy/canlii-mcp
Or with Docker Compose:
services:
canlii-mcp:
image: ghcr.io/tomilashy/canlii-mcp
environment:
CANLII_API: your_api_key
MCP_AUTH_TOKEN: your_secret # optional
ports:
- "3000:3000"
The server includes a Workers-compatible entry point (src/worker.ts).
npx wrangler secret put CANLII_API
npx wrangler secret put MCP_AUTH_TOKEN # optional
npx wrangler deploy
- Repository: tomilashy/canlii-mcp
- Project name: canlii-mcp
- Build command: `npm install && npm run build`
- Deploy command: `npx wrangler deploy` (pre-filled)
- Secret: `CANLII_API`

The MCP endpoint will be at https://canlii-mcp.<your-subdomain>.workers.dev/mcp.
| Environment Variable | Required | Default | Description |
|---|---|---|---|
| CANLII_API | Yes | — | Your CanLII API key |
| PORT | No | 3000 | HTTP server port (HTTP mode only) |
| MCP_AUTH_TOKEN | No | — | Bearer token for HTTP authentication. If set, all HTTP requests must include Authorization: Bearer <token>. If not set, the server runs without authentication. |
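The MCP_AUTH_TOKEN behavior described above amounts to a simple gate, which can be sketched like this (a minimal sketch with hypothetical names, not the server's actual middleware):

```typescript
// Sketch: if MCP_AUTH_TOKEN is set, every HTTP request must carry a
// matching "Authorization: Bearer <token>" header; if unset, auth is off.
function isAuthorized(
  authHeader: string | undefined,
  configuredToken: string | undefined
): boolean {
  if (!configuredToken) return true; // no token configured: auth disabled
  return authHeader === `Bearer ${configuredToken}`;
}

console.log(isAuthorized(undefined, undefined));      // true  (auth disabled)
console.log(isAuthorized("Bearer secret", "secret")); // true
console.log(isAuthorized("Bearer wrong", "secret"));  // false
console.log(isAuthorized(undefined, "secret"));       // false
```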
The server enforces CanLII's API rate limits automatically: requests that would exceed the daily limit return an error rather than hitting the API.
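A daily-limit guard of this kind can be sketched as follows (illustrative only; the class name and the cap value are assumptions, not taken from the server's source):

```typescript
// Sketch: count requests per UTC day and reject once a daily cap is
// reached, instead of forwarding the request to the CanLII API.
class DailyLimiter {
  private count = 0;
  private day = "";

  constructor(private readonly limit: number) {}

  tryAcquire(now: Date = new Date()): boolean {
    const today = now.toISOString().slice(0, 10); // "YYYY-MM-DD"
    if (today !== this.day) {                     // new day: reset counter
      this.day = today;
      this.count = 0;
    }
    if (this.count >= this.limit) return false;   // over the daily cap
    this.count++;
    return true;
  }
}

const limiter = new DailyLimiter(2);
console.log(limiter.tryAcquire()); // true
console.log(limiter.tryAcquire()); // true
console.log(limiter.tryAcquire()); // false (limit reached)
```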
npm install
npm run build # compile TypeScript
npm run watch # watch mode
This project uses Semantic Versioning via semantic-release. Commit messages follow the Conventional Commits spec:
| Commit prefix | Release type |
|---|---|
| fix: | Patch (1.0.0 → 1.0.1) |
| feat: | Minor (1.0.0 → 1.1.0) |
| feat!: or BREAKING CHANGE | Major (1.0.0 → 2.0.0) |
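The mapping in the table can be expressed as a small classifier (a sketch of the convention itself, not semantic-release's actual analyzer):

```typescript
type Release = "major" | "minor" | "patch" | null;

// Classify a commit message per the Conventional Commits rules above.
function releaseType(message: string): Release {
  // A "!" after the type/scope, or a BREAKING CHANGE footer, is a major bump.
  if (message.includes("BREAKING CHANGE") || /^[a-z]+(\(.+\))?!:/.test(message)) return "major";
  if (/^feat(\(.+\))?:/.test(message)) return "minor";
  if (/^fix(\(.+\))?:/.test(message)) return "patch";
  return null; // e.g. chore:, docs: — no release
}

console.log(releaseType("fix: handle empty database id")); // "patch"
console.log(releaseType("feat: add get_case tool"));       // "minor"
console.log(releaseType("feat!: drop Node 16 support"));   // "major"
console.log(releaseType("docs: update README"));           // null
```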
Pushing to main triggers the release workflow. If a release is cut, the Docker image is automatically built and published to ghcr.io.
MIT