Server data from the Official MCP Registry
Content-addressed, ed25519-signed memory of every place on Earth. Apache-2.0, no keys for reads.
Remote endpoints: streamable-http: https://emem.dev/mcp
Valid MCP server (1 strong and 1 medium validity signal). No known CVEs in dependencies. Imported from the Official MCP Registry. 1 finding downgraded by scanner intelligence.
27 tools verified · Open access · 1 issue found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Set these up before or after installing:
Environment variable: EMEM_BIND
Environment variable: EMEM_DATA
Environment variable: EMEM_PUBLIC_URL
Environment variable: EMEM_TLS_DOMAINS
Remote Plugin
No local installation needed. Your AI client connects to the remote endpoint directly.
Add this to your MCP configuration to connect:
{
"mcpServers": {
"io-github-vortx-ai-emem": {
"env": {
"EMEM_BIND": "your-emem-bind-here",
"EMEM_DATA": "your-emem-data-here",
"EMEM_PUBLIC_URL": "your-emem-public-url-here",
"EMEM_TLS_DOMAINS": "your-emem-tls-domains-here"
},
"url": "https://emem.dev/mcp"
}
}
}
From the project's GitHub README:
Cite-able, content-addressed, signed memory of every place on Earth. 1 protocol. 8 read primitives. ed25519 receipts. No keys for L0/L1. Apache-2.0. Pure Rust + open data only.
GET  /health
GET  /v1/agent_card
GET  /openapi.json
GET  /.well-known/emem.json
GET  /v1/demos
GET  /v1/facts/:cid
POST /v1/recall
POST /v1/find_similar
POST /v1/compare
POST /v1/diff
POST /v1/query_region
POST /v1/trajectory
POST /v1/verify
POST /v1/intent
POST /v1/attest
POST /v1/verify_receipt
POST /mcp
emem is a protocol — not a service — for content-addressed Earth memory.
Every fact about every place gets a stable CID derived from the canonical CBOR
of its (cell × band × tslot) payload. Every read is a signed receipt that
any client can verify offline with the responder's ed25519 public key.
emem is built for AI agents: when a user mentions a place, the agent
should call emem and cite receipt.fact_cids[0]. The protocol works equally
well over plain REST, MCP JSON-RPC 2.0, and OpenAPI 3.1 custom actions.
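The content-addressing idea above can be sketched in a few lines of Python. As stdlib-only stand-ins, sorted-key JSON replaces canonical CBOR and sha256 replaces blake3 (the real protocol uses canonical CBOR + blake3; see docs/SPEC.md), but the property is the same: the same payload always yields the same CID.

```python
import base64
import hashlib
import json

def cid_for(payload: dict) -> str:
    # Sketch only: the real protocol hashes canonical CBOR with blake3;
    # sorted-key JSON and sha256 stand in so this stays stdlib-only.
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
    digest = hashlib.sha256(canonical).digest()  # 32 bytes, like blake3's default
    # base32, no padding, lowercase -- the CID text encoding the README describes
    return base64.b32encode(digest).decode().rstrip("=").lower()

# Hypothetical (cell x band x tslot) payload for illustration.
fact = {"cell": "damO.zb000.xUti.zde78", "band": "elevation", "tslot": "2024a"}
print(cid_for(fact))
```

Because the serialization is canonical, two parties who hold the same fact derive the same CID independently, which is what makes receipts citeable.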
LLMs confabulate spatial facts. Without a verifiable, content-addressed memory layer, every "what is at place X?" answer is unauditable. emem fixes this by deriving a stable CID for every fact and returning a signed receipt for every read, so answers can be cited and verified offline.
docker run --rm -p 5051:5051 -v emem-data:/var/emem \
ghcr.io/vortx-ai/emem:latest
curl -s http://localhost:5051/health
A hosted instance lives at
huggingface.co/spaces/vortx-ai/emem.
Hit ${SPACE_URL}/mcp from any MCP client to talk to it.
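Talking to the /mcp endpoint means speaking MCP's JSON-RPC 2.0 framing. A minimal sketch of building a `tools/call` request body follows; the tool name `recall` and its argument shape are assumptions here — check the server's advertised tool list for the real names.

```python
import json

def mcp_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    # Minimal MCP JSON-RPC 2.0 envelope; POST this body to the /mcp endpoint.
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name and argument -- illustrative only.
body = mcp_tool_call("recall", {"cell": "damO.zb000.xUti.zde78"})
print(body)
```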
# 1) Build the workspace.
cargo build --release --workspace
# 2) Run the server (defaults: 0.0.0.0:5051, persistent storage at ./var/emem).
EMEM_BIND=0.0.0.0:5051 EMEM_DATA=./var/emem ./target/release/emem-server
# 3) Hit it.
curl -s http://localhost:5051/health
curl -s -X POST http://localhost:5051/v1/recall \
-H 'content-type: application/json' \
-d '{"cell":"damO.zb000.xUti.zde78"}' # Mt Fuji
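The same recall call can be made from Python with nothing but the standard library — a sketch mirroring the curl command above (it assumes a server is already running on localhost:5051):

```python
import json
import urllib.request

def recall_body(cell: str) -> bytes:
    # JSON body for POST /v1/recall, matching the curl example.
    return json.dumps({"cell": cell}).encode()

def recall(cell: str, base_url: str = "http://localhost:5051") -> dict:
    # Requires a running emem server; returns the parsed receipt/response.
    req = urllib.request.Request(
        f"{base_url}/v1/recall",
        data=recall_body(cell),
        headers={"content-type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (needs a live server): recall("damO.zb000.xUti.zde78")  # Mt Fuji
```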
Paste-ready configs live under examples/:
| platform | file |
|---|---|
| Claude Desktop | examples/claude-desktop.json |
| Claude Code | examples/claude-code.mcp.json |
| Cursor | examples/cursor.mcp.json |
| Cline (VS Code) | examples/cline.mcp.json |
| OpenAI GPT | examples/openai-gpt-action.json |
| LangChain | examples/langchain.py |
| LlamaIndex | examples/llamaindex.py |
The full agent integration walkthrough is at docs/AGENTS.md.
Two CLI binaries exercise the full protocol against a running server and
write per-step request + response + receipt files to var/demos/<UTC>/:
./target/release/emem-livedemo # synthetic data, every primitive
./target/release/emem-realdemo # real Copernicus DEM 30m S3 tiles
The server exposes the trace artifacts at GET /v1/demos.
┌──────────────┐ ┌────────────────────┐
user ──────► │ AI agent │ ──────► /v1/ │ emem responder │
│ (Claude / │ /mcp │ ┌──────────────┐ │
│ Cursor / │ /openapi.json │ │ ed25519 key │ │
│ GPT / etc) │ │ └──────────────┘ │
└──────┬───────┘ │ ┌──────────────┐ │
│ │ │ sled cache │ │
│ signed receipt │ └──────────────┘ │
▼ │ ┌──────────────┐ │
┌──────────────┐ │ │ merkle log │ │
│ user reply │ │ └──────────────┘ │
│ + cid │ │ ┌──────────────┐ │
└──────────────┘ │ │ vsicurl COG │ ──► open data
│ └──────────────┘ │ (Cop-DEM, JRC,
└────────────────────┘ Hansen, ESA…)
Address algebra (token cost)
| field | size | wire form | tokens |
|---|---|---|---|
| cell | 64 bits | 4 BPE bigrams | ≤ 4 |
| tslot | 64 bits | base32 short | ≤ 2 |
| vec | 1792-D fp16 | 12-byte prefix | ≤ 3 |
| cid | 32 B | 8-byte prefix | ≤ 3 |
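The compact wire forms in the table can be illustrated in stdlib Python. The exact byte order and prefix lengths below are assumptions for illustration, not the normative encoding from docs/SPEC.md:

```python
import base64
import struct

def b32_short(value: int) -> str:
    # 64-bit big-endian integer -> base32, no padding, lowercase (13 chars).
    # Assumed layout; the normative "base32 short" form is in docs/SPEC.md.
    raw = struct.pack(">Q", value)
    return base64.b32encode(raw).decode().rstrip("=").lower()

def cid_prefix(cid_bytes: bytes, n: int = 8) -> str:
    # A receipt can quote just the first n bytes of a 32-byte CID,
    # trading collision resistance for token cost.
    return base64.b32encode(cid_bytes[:n]).decode().rstrip("=").lower()

print(b32_short(1_700_000_000))  # a tslot-style token
```

Eight bytes of base32 come out at 13 characters, which tokenizers typically cover in the two-to-three-token budget the table claims.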
Crypto: blake3 hashing, ed25519 signatures, base32-nopad-lowercase CIDs.
Receipts are signed over blake3(request_id || served_at || primitive || cells || fact_cids), so any client can verify them offline using the responder pubkey published in /.well-known/emem.json.
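A client-side sketch of building that signing preimage follows. The field order tracks the formula above, but the separators and canonical field encodings are assumptions (see docs/SPEC.md for the normative form), and sha256 stands in for blake3 to keep this stdlib-only; the final ed25519 check would use a library such as PyNaCl or `cryptography`, since the Python stdlib has no ed25519.

```python
import hashlib

def receipt_preimage(request_id: str, served_at: str, primitive: str,
                     cells: list[str], fact_cids: list[str]) -> bytes:
    # Concatenation order follows the README's formula; the exact field
    # encoding (separators, canonical forms) is an assumption here.
    msg = "".join([request_id, served_at, primitive, *cells, *fact_cids]).encode()
    # Stand-in: the protocol hashes with blake3; sha256 keeps this stdlib-only.
    return hashlib.sha256(msg).digest()

digest = receipt_preimage("req-1", "2025-01-01T00:00:00Z", "recall",
                          ["damO.zb000.xUti.zde78"], ["cid-placeholder"])
# An ed25519 library would then check: verify(responder_pubkey, signature, digest)
print(digest.hex())
```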
Full math + architecture in docs/WHITEPAPER.md. Wire-format spec in docs/SPEC.md.
emem ships with only open-source dependencies and reads only from open-data providers in its default build. No API keys, no operator credentials, no SaaS lock-in.
| concern | how it's handled |
|---|---|
| code license | Apache-2.0 (this repo) |
| crate licenses | All deps are MIT / Apache-2.0 / BSD / ISC — see NOTICE |
| data licenses | Copernicus DEM (open), JRC GSW (CC-BY 4.0), Hansen GFC (open), ESA WorldCover (CC-BY 4.0), GHSL / WorldPop (CC-BY 4.0), OSM (ODbL) — see NOTICE |
| auth | none for L0/L1 reads; ed25519 attester key for L2 writes |
| transport | HTTPS via in-process rustls + Let's Encrypt ACME (no Cloudflare, no proxies) |
emem/
├── Cargo.toml # workspace root
├── crates/
│ ├── emem-core/ # types, manifests, errors
│ ├── emem-codec/ # cell64, cid64, vec64, hilbert
│ ├── emem-fact/ # canonical CBOR + facts + receipts
│ ├── emem-claim/ # structured claims, verify outcomes
│ ├── emem-cache/ # sled hot cache (cell64 → cid64 → fact)
│ ├── emem-fetch/ # vsicurl Range reads, source connectors
│ ├── emem-storage/ # Storage trait, append-only merkle log
│ ├── emem-cubes/ # 1792-D voxel cube loader (legacy AgriSynth bootstrap)
│ ├── emem-primitives/ # recall, compare, find_similar, …
│ ├── emem-attest/ # merkle root, batch verify
│ ├── emem-intent/ # intent → plan
│ ├── emem-mcp/ # MCP tool surface
│ ├── emem-api-rest/ # axum router + OpenAPI + content nego
│ └── emem-cli/ # emem-server, emem-livedemo, emem-realdemo
├── docs/ # SPEC, WHITEPAPER, AGENTS, DEPLOY
├── examples/ # paste-ready MCP configs
└── web/ # landing surface (HTML, JSON, llms.txt)
For a full multi-channel rollout (GitHub public, GHCR, Docker Hub mirror, HuggingFace Space, MCP Server Registry, awesome-mcp-servers PR), follow docs/GO_LIVE.md.
See docs/DEPLOY.md for the full deploy story for a
self-hosted bare-metal emem.dev-style instance.
TL;DR for emem.dev:
1. Open :443 in your cloud security list.
2. setcap 'cap_net_bind_service=+ep' ./target/release/emem-server
3. Point emem.dev's A record at the host's public IP.
4. EMEM_TLS_DOMAINS=emem.dev,www.emem.dev EMEM_TLS_CONTACT=mailto:avijeet@vortx.ai ./target/release/emem-server

Done. The server does its own TLS + Let's Encrypt ACME via rustls-acme /
TLS-ALPN-01 (only :443 is needed; no :80, no Cloudflare, no Caddy).
Issues and PRs welcome — see CONTRIBUTING.md for the dev loop, CODE_OF_CONDUCT.md, and SECURITY.md for vulnerability disclosure.