Server data from the Official MCP Registry
MCP server for the Open Archives genealogical search engine.
A well-structured MCP server that acts as a proxy to the Open Archieven genealogical API. The code is clean with proper input validation via Zod schemas, reasonable error handling, and appropriate use of rate limiting. Origin validation prevents DNS-rebinding attacks on the MCP endpoint. Minor code quality concerns around broad exception handling and sensitive parameter logging do not significantly impact security given the server's read-only, proxy nature. Supply chain analysis found 10 known vulnerabilities in dependencies (0 critical, 6 high severity). Package verification found 1 issue.
4 files analyzed · 16 issues found
Add this to your MCP configuration file:
```json
{
  "mcpServers": {
    "io-github-coret-openarchieven-mcp-server": {
      "args": [
        "-y",
        "@coret/openarchieven-mcp-server"
      ],
      "command": "npx"
    }
  }
}
```

From the project's GitHub README:
Production-grade hybrid MCP + HTTP + SSE server generated from the Open Archives OpenAPI specification.
OpenAPI sources used to generate tools:

- `../api/openapi.yaml` (local)
- `https://api.openarchieven.nl/openapi.yaml` (remote)
A schema-aware server that automatically converts the OpenAPI specification into callable tools and exposes them through multiple transports:
A hosted endpoint is available — no installation required.
In claude.ai or Claude Desktop:
https://mcp.openarchieven.nl/

No authentication is required — Open Archives is a public dataset.
Once the connector is added you can ask Claude, for example:
Claude will call the matching tool (search_records, show_record,
get_marriages, get_historical_weather, get_census_data, …) and
return links to the corresponding record pages on
https://www.openarchieven.nl.
If you prefer running the server locally as a stdio MCP server:

```shell
npx -y @coret/openarchieven-mcp-server
```
Every API operation becomes a tool automatically via generate.ts.
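generate.ts itself is not reproduced in this document; conceptually, each OpenAPI operation maps to an MCP tool definition roughly as sketched below. The interfaces and field names here are illustrative, not the project's actual code:

```typescript
// Illustrative shapes for an OpenAPI operation and its parameters.
interface OpenApiParam {
  name: string;
  required?: boolean;
  schema: Record<string, unknown>;
  description?: string;
}

interface Operation {
  operationId: string;
  summary?: string;
  parameters?: OpenApiParam[];
}

// Convert one operation into a tool definition whose inputSchema is a
// JSON-Schema object built from the operation's parameters.
function operationToTool(op: Operation) {
  const properties: Record<string, unknown> = {};
  const required: string[] = [];
  for (const p of op.parameters ?? []) {
    // Copy the parameter schema verbatim so clients validate against it.
    properties[p.name] = {
      ...p.schema,
      ...(p.description ? { description: p.description } : {}),
    };
    if (p.required) required.push(p.name);
  }
  return {
    name: op.operationId,
    description: op.summary ?? op.operationId,
    inputSchema: { type: "object", properties, required },
  };
}
```

The tool name comes straight from the `operationId`, which is why the table below mirrors the API's operation names.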
All 17 operations:
| Tool Name | Description |
|---|---|
| `search_records` | Search genealogical records |
| `show_record` | Show a single genealogical record |
| `match_record` | Match a person to birth and death records |
| `get_births_years_ago` | List births from N years ago |
| `get_births` | Find birth records |
| `get_deaths` | Find death records |
| `get_marriages` | Find marriage records |
| `get_archives` | List all archives with statistics |
| `get_record_stats` | Record count per archive |
| `get_source_type_stats` | Record count per source type |
| `get_event_type_stats` | Record count per event type |
| `get_comment_stats` | Comment count statistics |
| `get_family_name_stats` | Family name frequency |
| `get_first_name_stats` | First name frequency |
| `get_profession_stats` | Profession frequency |
| `get_historical_weather` | Historical weather from KNMI |
| `get_census_data` | Dutch census data 1795–1899 |
Note: the `callback` (JSONP) parameter present in the upstream API is excluded from all tools; it is irrelevant in an MCP/JSON-RPC context.
Input validation uses the actual OpenAPI parameter schemas.
- `POST /` ← canonical public endpoint (mcp.openarchieven.nl)
- `POST /mcp` ← local / legacy alias
Stateless JSON-RPC transport — a new MCP server instance is created per request.
Origin validation: browser requests must come from `claude.ai`, `claude.com`, or any domain listed in `ALLOWED_ORIGINS`. Requests with no `Origin` header (native MCP clients, `curl`, server-to-server) are accepted. Unknown origins receive HTTP 403.
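A check of this shape implements the Origin rule described above. This is an illustrative sketch, not the project's actual middleware:

```typescript
// Decide whether a request may proceed, per the rules above:
// - no Origin header: allow (native MCP clients, curl, server-to-server)
// - claude.ai / claude.com, their subdomains, or an ALLOWED_ORIGINS
//   entry: allow
// - anything else: the caller should answer with HTTP 403
function isOriginAllowed(origin: string | undefined, extraOrigins: string[]): boolean {
  if (!origin) return true; // non-browser clients send no Origin header
  let host: string;
  try {
    host = new URL(origin).hostname;
  } catch {
    return false; // malformed Origin header
  }
  const allowed = ["claude.ai", "claude.com", ...extraOrigins];
  return allowed.some((d) => host === d || host.endsWith(`.${d}`));
}
```

For example, `isOriginAllowed(req.headers.origin, (process.env.ALLOWED_ORIGINS ?? "").split(",").filter(Boolean))` would gate an incoming HTTP request (the environment-variable parsing shown is an assumption).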
GET /tools
POST /tools/:name
GET /events/:name
POST /stream/:name
Streaming endpoints (/events/:name, /stream/:name) automatically paginate through results for endpoints that support a start offset:
- the `start` offset advances by `number_show` per page
- a `: heartbeat` comment is sent every 10 seconds to keep connections alive

Optional Redis support. If Redis is running, responses are cached for `CACHE_TTL` seconds. If Redis is unavailable, the server runs normally without caching.
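The streaming endpoints' auto-pagination can be sketched as an async generator that advances the `start` offset until a short page signals the end of results. The names below are illustrative, not the server's actual code:

```typescript
// Yield pages by advancing `start` in steps of `pageSize`; a page with
// fewer than `pageSize` docs means there is nothing left to fetch.
async function* paginate<T>(
  fetchPage: (start: number) => Promise<{ docs: T[] }>,
  pageSize: number,
): AsyncGenerator<{ docs: T[] }> {
  for (let start = 0; ; start += pageSize) {
    const page = await fetchPage(start);
    yield page;
    if (page.docs.length < pageSize) return; // short page: stop
  }
}
```

Each yielded page would become one `event: page` SSE message (or one NDJSON line on `/stream/:name`).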
The upstream API enforces 4 requests per second per IP. The server queues all upstream calls through a token-bucket rate limiter (configurable via RATE_LIMIT_RPS).
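A token-bucket queue of the kind described can be sketched as follows; this is an illustrative implementation, not the server's actual limiter:

```typescript
// Token bucket: holds up to `rps` tokens, refilled continuously at
// `rps` tokens per second. Each upstream call takes one token and
// waits if none is available, so sustained throughput stays <= rps.
class TokenBucket {
  private tokens: number;
  private last = Date.now();

  constructor(private rps: number) {
    this.tokens = rps; // start full: short bursts pass immediately
  }

  private refill(): void {
    const now = Date.now();
    this.tokens = Math.min(
      this.rps,
      this.tokens + ((now - this.last) / 1000) * this.rps,
    );
    this.last = now;
  }

  async take(): Promise<void> {
    for (;;) {
      this.refill();
      if (this.tokens >= 1) {
        this.tokens -= 1;
        return;
      }
      // Sleep roughly one token-interval, then re-check.
      await new Promise((r) => setTimeout(r, 1000 / this.rps));
    }
  }
}

// Usage sketch: gate every upstream fetch through the bucket.
const bucket = new TokenBucket(4); // RATE_LIMIT_RPS
async function callUpstream(path: string) {
  await bucket.take();
  return fetch(`https://api.openarchieven.nl/1.1${path}`);
}
```

Because the bucket is shared process-wide, concurrent tool calls queue up behind it instead of each hitting the upstream limit independently.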
GET /health
- `generate.ts`
- `server.ts`
- `tsconfig.json`
- `package.json`
- `.env.example`
- `generated/`
  - `tools.json`
  - `spec.json`
Copy `.env.example` to `.env` and adjust:

```shell
cp .env.example .env
```
| Variable | Default | Description |
|---|---|---|
| `PORT` | `3001` | HTTP port |
| `OPENAPI_PATH` | `../api/openapi.yaml` | Path or URL to OpenAPI spec |
| `UPSTREAM_BASE` | `https://api.openarchieven.nl/1.1` | Upstream API base URL |
| `RATE_LIMIT_RPS` | `4` | Upstream requests per second |
| `REDIS_URL` | `redis://localhost:6379/5` | Redis connection URL (db 5) |
| `CACHE_TTL` | `3600` | Cache TTL in seconds |
| `LOG_LEVEL` | `info` | One of `trace`, `debug`, `info`, `warn`, `error`, `fatal` |
| `NODE_ENV` | (unset) | Set to `production` for JSON logs (default: pretty-printed) |
| `ALLOWED_ORIGINS` | (empty) | Extra `Origin` headers allowed on the MCP endpoint (comma-separated). Claude domains and requests without an `Origin` header are always allowed. |
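With those defaults, a minimal local `.env` might look like this (values shown are the table's defaults, repeated here for convenience; adjust to taste):

```
PORT=3001
OPENAPI_PATH=../api/openapi.yaml
UPSTREAM_BASE=https://api.openarchieven.nl/1.1
RATE_LIMIT_RPS=4
REDIS_URL=redis://localhost:6379/5
CACHE_TTL=3600
LOG_LEVEL=info
ALLOWED_ORIGINS=
```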
```shell
npm install
```
Run from local spec:

```shell
npx tsx generate.ts
```

Or from remote URL:

```shell
npx tsx generate.ts https://api.openarchieven.nl/openapi.yaml
```
Expected result:

```
Generated 17 tools
Output: generated/tools.json, generated/spec.json
```

Creates:

- `generated/tools.json`
- `generated/spec.json`
```shell
npx tsx server.ts
```
Expected startup (development — pretty-printed):

```
[12:00:00] INFO: Open Archieven MCP server started
    port: 3001
    tools: 17
    upstream: "https://api.openarchieven.nl/1.1"
    rateLimit: "4 req/s"
    redis: "redis://localhost:6379/5"
    env: "development"
```
In production (NODE_ENV=production) each log line is a single JSON object.
Server binds to `http://0.0.0.0:3001`.
```shell
curl http://localhost:3001/health
```

Expected:

```json
{
  "ok": true,
  "tools": 17,
  "redis": false,
  "uptime": 1.23
}
```
```shell
curl http://localhost:3001/tools
```

Expected:

```json
[
  "search_records",
  "show_record",
  "match_record",
  "get_births_years_ago",
  "get_births",
  "get_deaths",
  "get_marriages",
  "get_archives",
  "get_record_stats",
  "get_source_type_stats",
  "get_event_type_stats",
  "get_comment_stats",
  "get_family_name_stats",
  "get_first_name_stats",
  "get_profession_stats",
  "get_historical_weather",
  "get_census_data"
]
```
```shell
curl -X POST http://localhost:3001/tools/search_records \
  -H "Content-Type: application/json" \
  -d '{"name":"Coret"}'
```

```shell
curl -X POST http://localhost:3001/tools/show_record \
  -H "Content-Type: application/json" \
  -d '{"archive":"hua","identifier":"E13B9821-C0B0-4AED-B20B-8DE627ED99BD"}'
```
```shell
curl -N "http://localhost:3001/events/search_records?name=Coret"
```

Expected stream:

```
event: page
data: {...}

event: page
data: {...}

event: done
data: {}
```

Leave the SSE connection open for 15+ seconds and expect periodic keep-alive lines:

```
: heartbeat
```
```shell
curl -N -X POST http://localhost:3001/stream/search_records \
  -H "Content-Type: application/json" \
  -d '{"name":"Coret"}'
```

Expected (newline-delimited JSON):

```
{"query":{...},"response":{"number_found":...,"docs":[...]}}
{"query":{...},"response":{"number_found":...,"docs":[...]}}
```
```shell
curl -X POST http://localhost:3001/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2025-03-26",
      "capabilities": {},
      "clientInfo": { "name": "test", "version": "1.0" }
    }
  }'
```

```shell
curl -X POST http://localhost:3001/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/list"
  }'
```

```shell
curl -X POST http://localhost:3001/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
      "name": "search_records",
      "arguments": { "name": "Coret" }
    }
  }'
```
```shell
redis-server
```

Restart the MCP server. Expected in `/health`:

```json
{ "redis": true }
```

Stop Redis and restart. Expected:

```json
{ "redis": false }
```
```shell
npx tsx generate.ts
npx tsx server.ts
```

```shell
npx tsx generate.ts
```
Linux / macOS:

```shell
lsof -i :3001
kill -9 <PID>
```

Windows:

```shell
netstat -ano | findstr :3001
taskkill /PID <PID> /F
```
Server runs normally without Redis. Check REDIS_URL in .env.
The upstream API allows 4 req/s per IP. The built-in rate limiter queues requests automatically. If you are running multiple server instances, reduce RATE_LIMIT_RPS or use a shared queue.
This server is a thin proxy over the public Open Archives API. It does not require user authentication and does not collect personal data of its own.
The full privacy policies of the operators apply in addition to this section: the hosted endpoint mcp.openarchieven.nl and the upstream Open Archives API.

- Tool calls are forwarded to `https://api.openarchieven.nl/1.1` to fulfill the request, and the upstream response is returned to the caller.
- Logs are written to stdout via pino. On the hosted endpoint these logs are ephemeral: they are not written to disk and are lost on process restart. Set `LOG_LEVEL=warn` to suppress argument logging.
- Redis cache keys have the form `mcp:<tool>:<sorted-params-json>`. The cache contains response bodies only; no user identifiers are stored.

No data is sent to any service other than the upstream Open Archives API listed above. There are no analytics, telemetry, advertising, or observability third parties involved.
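Under the `mcp:<tool>:<sorted-params-json>` scheme, argument order must not affect the key; sorting the top-level parameter names before serialization achieves that. A sketch, not the project's code:

```typescript
// Deterministic cache key: sort top-level parameter names so that
// identical arguments in any order map to the same Redis entry.
function cacheKey(tool: string, params: Record<string, unknown>): string {
  const sorted = Object.fromEntries(
    Object.keys(params)
      .sort()
      .map((k) => [k, params[k]]),
  );
  return `mcp:${tool}:${JSON.stringify(sorted)}`;
}
```

For example, `cacheKey("search_records", { start: 0, name: "Coret" })` and the same call with the keys swapped produce the identical key, so both hit the same cache entry.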
| Data | Retention |
|---|---|
| Tool arguments and responses | Not persisted by the application |
| Application logs | Ephemeral (stdout, lost on restart) |
| Redis cache entries | `CACHE_TTL` seconds (default 1 hour), then evicted |
| Reverse-proxy access logs | Per the hosting provider's standard retention policy |
The MCP endpoint validates the Origin header on every request and
rejects unknown browser origins (DNS-rebinding defense). All transport is
over HTTPS.
Tool responses include URLs that point to record pages on https://www.openarchieven.nl. The submission declares the following allowed link URI so users are not prompted to confirm each link: https://www.openarchieven.nl

For privacy questions or requests, contact: genealogie@coret.org

v1.0
Schema-perfect OpenAPI-generated MCP server for Open Archives.