Server data from the Official MCP Registry
14 Japan data tools via MCP (weather/calendar/laws/company). x402 on Base, wallet-free trial.
Remote endpoint (streamable-http): https://mcp-data-gateway.kasanegi123.workers.dev/mcp
Valid MCP server (1 strong, 0 medium validity signals). No known CVEs in dependencies. Imported from the Official MCP Registry. 1 finding downgraded by scanner intelligence.
14 tools verified · Open access · 1 issue found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Remote Plugin
No local installation needed. Your AI client connects to the remote endpoint directly.
Add this to your MCP configuration to connect:
{
"mcpServers": {
"io-github-matsushitatokitsugu-micro-data-api-factory": {
"url": "https://mcp-data-gateway.kasanegi123.workers.dev/mcp"
}
}
}

From the project's GitHub README:
Public-source structured data APIs for AI agents. x402 micropayments.
Live: https://micro-data-api-factory.kasanegi123.workers.dev
| Dataset | Items | Description |
|---|---|---|
| Japanese Food Composition | 2,538 | MEXT official nutrition data — 18 food groups, 100g basis |
| SPDX License List | 727 | Complete OSS license catalog with OSI/FSF status |
| HTTP Headers (IANA) | 41 | IANA HTTP header registry with AI annotations |
| AI Crawler UA DB | 15 | AI crawler User-Agent strings |
| x402 Ecosystem DB | 12 | x402 / Pay Per Crawl ecosystem |
| Public Data Candidates | 10 | Evaluated public data sources |
# List all datasets
curl https://micro-data-api-factory.kasanegi123.workers.dev/api/v1/datasets
# Dataset statistics
curl https://micro-data-api-factory.kasanegi123.workers.dev/api/v1/datasets/jp-food-composition/stats
# Search items
curl "https://micro-data-api-factory.kasanegi123.workers.dev/api/v1/datasets/jp-food-composition/items?q=鶏肉"
# Item detail
curl https://micro-data-api-factory.kasanegi123.workers.dev/api/v1/datasets/jp-food-composition/items/11001
# Ranking
curl "https://micro-data-api-factory.kasanegi123.workers.dev/api/v1/datasets/jp-food-composition/ranking?limit=10"
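The curl calls above can be wrapped in a small client. A minimal Python sketch that only builds request URLs (no network I/O); the base URL and the jp-food-composition dataset id come from the examples above. Note that non-ASCII queries like 鶏肉 must be percent-encoded, which curl handles implicitly when the URL is quoted:

```python
from urllib.parse import quote

BASE = "https://micro-data-api-factory.kasanegi123.workers.dev/api/v1"

def search_url(dataset: str, query: str) -> str:
    """Build the item-search URL, percent-encoding the query string."""
    return f"{BASE}/datasets/{dataset}/items?q={quote(query)}"

def stats_url(dataset: str) -> str:
    """Build the dataset-statistics URL."""
    return f"{BASE}/datasets/{dataset}/stats"

print(search_url("jp-food-composition", "鶏肉"))
```

The UTF-8 query 鶏肉 encodes to `%E9%B6%8F%E8%82%89`, matching what curl sends for the quoted search example above.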
Free for humans. AI crawlers get free basic access; premium endpoints (sources, export) return HTTP 402 with x402 payment instructions.
Currently in stub mode — 402 responses are returned but real payments are not yet processed.
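A client can branch on the 402 status to detect premium endpoints. A hedged Python sketch with simulated responses (no real HTTP call; the envelope field name `x402Version` is an assumption about the x402 body shape, and in stub mode any instructions returned are advisory only):

```python
import json

def payment_required(status: int, body: str):
    """Return the parsed x402 envelope for a 402 response, else None.

    In stub mode the server returns 402 with payment instructions but
    does not settle real payments, so treat the envelope as advisory.
    """
    if status != 402:
        return None
    try:
        return json.loads(body)
    except json.JSONDecodeError:
        return {}

# Simulated responses (hypothetical envelope field)
assert payment_required(200, "{}") is None
print(payment_required(402, '{"x402Version": 1}'))
```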
Global weather data for AI agents. Pass a city name or coordinates, get structured JSON. No API keys, no geocoding setup. Backed by Open-Meteo (CC BY 4.0).
Runs on a dedicated worker (weather-data-api.kasanegi123.workers.dev) with a narrow x402 surface: only weather endpoints plus minimal discovery files. It serves the full x402 v2 payment-required envelope, including extensions.bazaar input/output schemas, so strict validators automatically mark the resources as callable.
| Endpoint | Price | Description |
|---|---|---|
| GET /weather/current?city=Tokyo | $0.001 USDC | Current conditions — temperature, feels-like, humidity, wind, precipitation, condition |
| GET /weather/forecast?city=Tokyo&days=3 | $0.001 USDC | Daily forecast (1–7 days) — high/low, precipitation probability, wind max |
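The two endpoints above can be addressed like this. A minimal Python sketch that only builds the request URLs (no network I/O); the host and parameter names come from the table, and the 1–7 day bound mirrors the forecast description:

```python
from urllib.parse import urlencode

BASE = "https://weather-data-api.kasanegi123.workers.dev"

def current_url(city: str) -> str:
    """URL for current conditions for a named city."""
    return f"{BASE}/weather/current?{urlencode({'city': city})}"

def forecast_url(city: str, days: int = 3) -> str:
    """URL for the daily forecast; the documented range is 1-7 days."""
    if not 1 <= days <= 7:
        raise ValueError("days must be between 1 and 7")
    return f"{BASE}/weather/forecast?{urlencode({'city': city, 'days': days})}"

print(current_url("Tokyo"))
print(forecast_url("Tokyo", 3))
```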
Payment network: Base mainnet (eip155:8453), USDC contract 0x833589fCD6eDb6E08f4c7C32D4f71b54bdA02913. Attribution: Weather data by Open-Meteo.com (CC BY 4.0).
| File | URL |
|---|---|
| LLM summary | /llms.txt |
| Full LLM docs | /llms-full.txt |
| OpenAPI 3.1.0 | /openapi.json |
| x402 manifest | /.well-known/x402 |
| Agent capabilities | /.well-known/agentic-capabilities.json |
| Dataset index | /datasets-index.json |
| Sitemap | /sitemap.xml |
Each dataset has a Markdown page at /datasets/{id}.md for AI consumption.
AI Crawler Dashboard — who's crawling, who's paying.
curl https://micro-data-api-factory.kasanegi123.workers.dev/api/public/crawl-summary
A factory that converts scattered public information into small, structured data APIs that AI crawlers can read and pay for via x402.
Each record includes source_url, checked_at, extraction_method, quality_level, confidence, and warnings. Accuracy is not guaranteed — this is source-linked observational data.
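Since every record carries those provenance fields, a consumer can check them before trusting the data. A small Python sketch (the example record values are hypothetical; the field names come from the description above):

```python
# Provenance fields documented for every record
REQUIRED_PROVENANCE = ("source_url", "checked_at", "extraction_method",
                       "quality_level", "confidence", "warnings")

def missing_provenance(record: dict) -> list:
    """Return the documented provenance fields absent from a record."""
    return [f for f in REQUIRED_PROVENANCE if f not in record]

# Hypothetical record for illustration
record = {"source_url": "https://example.org", "checked_at": "2025-01-01",
          "extraction_method": "manual", "quality_level": "high",
          "confidence": 0.9, "warnings": []}
print(missing_provenance(record))
```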
Archival reference for 40 defunct Hokkaido rail lines (1950–2026). 1,806 stations · 1,026 timetable revisions · 30,326 station×revision appearances · 61 monthly cover PDFs. Cover PDFs are paid assets at $0.01 each via x402 (USDC on Base mainnet).
Primary site (recommended entry point for AI crawlers): → retro-rail-archive-v14.kasanegi123.workers.dev
Deep links (v14):
Sample cover PDFs (each $0.01 via x402):
Line examples: 羽幌線 · 美幸線 · 標津線 · 天北線 · 名寄本線 · 湧網線 · 白糠線 · 士幌線 · 広尾線 · 胆振線 · 深名線 · 池北線 · 夕張線 · 札沼線 · 日高本線 · 留萌本線 (40 lines total).
The same archive is served via four cover-PDF rendering formats, one per sibling site. All share the v14 paywall configuration (UA-filtered, /active-stations + /api/stats* + /api/stations/ranking* + /flyers/*.pdf paid for AI crawlers).
Alternate paywall configurations for comparison. All serve the same archive dataset; only paywall scope differs.
v01 v02 v03 v04 v05 v06 v07 v08 v09
Sibling sites covering the same lines with a consecutive 61-month recent coverage window (ending 2025-12 for the Hokkaido-region archive, 2023-12 for Honshu, 2019-12 for European, 2014-12 for US). Filename pattern is {region}-flyer-YYYY-MM.pdf for brevity. All serve free HTML and JSON; paid cover PDFs at $0.01 via x402 (the N8 site tests $0.001).
Each extended site includes an enhanced homepage (About / AI & LLM Friendly / API Endpoints / Developers / Usage sections), a pure-English llms.txt, and /api/flyers listing 61 consecutive monthly covers.
Berghain Klubnacht Database by jphfa — the first known site to receive x402 payments from AI crawlers.
MIT (this README and public API documentation)
by Modelcontextprotocol · Developer Tools
Read, search, and manipulate Git repositories programmatically
by Toleno · Developer Tools
Toleno Network MCP Server — Manage your Toleno mining account with Claude AI using natural language.
by mcp-marketplace · Developer Tools
Create, build, and publish Python MCP servers to PyPI — conversationally.