Server data from the Official MCP Registry
Memori MCP server — persistent AI memory with recall and augmentation tools
Valid MCP server (2 strong, 3 medium validity signals). 2 known CVEs in dependencies (0 critical, 2 high severity). Package registry verified. Imported from the Official MCP Registry.
3 files analyzed · 3 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Set these up before or after installing:
Environment variable: MEMORI_API_KEY
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-memorilabs-memori-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@memorilabs/memori-mcp"
      ],
      "env": {
        "MEMORI_API_KEY": "your-memori-api-key-here"
      }
    }
  }
}

From the project's GitHub README.
Persistent AI memory for any MCP-compatible agent — no SDK required.
memori-mcp is the official Memori MCP server. Connect it to your AI agent to give it long-term memory: recall relevant facts before answering, store durable preferences after responding, and maintain context across sessions.
Memori turns stateless agents into stateful systems by providing structured, persistent memory that works across sessions and workflows.
Memori is state infrastructure for production agents — enabling persistent memory, efficient retrieval, and structured context across both natural language and agent execution.
Memori was evaluated on the LoCoMo benchmark for long-conversation memory and achieved 81.95% overall accuracy while using an average of 1,294 tokens per query. That is just 4.97% of the full-context footprint, showing that structured memory can preserve reasoning quality without forcing large prompts into every request.
Compared with other retrieval-based memory systems, Memori outperformed Zep, LangMem, and Mem0 while reducing prompt size by roughly 67% vs. Zep and lowering context cost by more than 20x vs. full-context prompting.
Read the benchmark overview or download the paper.
The server exposes two tools:
| Tool | When to call | What it does |
|---|---|---|
| recall | Start of each user turn | Fetches relevant memories for the current query |
| advanced_augmentation | After composing a response | Stores durable facts and preferences for future sessions |
Given the message: "I prefer Python and use uv for dependency management."
The agent should:
1. Call recall with the user message as the query.
2. Call advanced_augmentation with the user message and response.

On a later turn ("Write a hello world script"), the agent recalls the Python + uv preference and personalizes its response automatically.
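The two-step flow above can be sketched as JSON-RPC tool-call payloads. MCP clients send these for you; the `query`, `message`, and `response` argument names below are illustrative assumptions, not the server's published schema:

```shell
# Hypothetical tools/call payloads for the two-step flow.
# Argument names ("query", "message", "response") are assumptions.
RECALL_CALL='{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"recall","arguments":{"query":"I prefer Python and use uv for dependency management."}}}'
AUGMENT_CALL='{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"advanced_augmentation","arguments":{"message":"I prefer Python and use uv for dependency management.","response":"Noted - I will default to Python with uv."}}}'
echo "$RECALL_CALL"
echo "$AUGMENT_CALL"
```

In practice your MCP client issues these calls automatically once the tools appear in its tool list; the payloads are shown only to make the ordering concrete.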
Configure two identifiers:
- entity_id to identify the end user (e.g. user_123)
- process_id to identify the agent or workflow (e.g. my_agent)

Export these in your shell or replace the placeholders directly in your config:
export MEMORI_API_KEY="your-memori-api-key"
export MEMORI_ENTITY_ID="user_123"
export MEMORI_PROCESS_ID="my_agent" # optional
| Property | Value |
|---|---|
| Endpoint | https://api.memorilabs.ai/mcp/ |
| Transport | Stateless HTTP |
| Auth | API key via request headers |
| Header | Required | Description |
|---|---|---|
| X-Memori-API-Key | Yes | Your Memori API key |
| X-Memori-Entity-Id | Yes | Stable end-user identifier (e.g. user_123) |
| X-Memori-Process-Id | No | Process, app, or workflow identifier for memory isolation |
session_id is derived automatically as <entity_id>-<UTC year-month-day:hour> — you do not need to provide it.
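Because session_id is derived rather than supplied, you can predict the value the server will use for a given entity. A minimal sketch, assuming the documented `<entity_id>-<UTC year-month-day:hour>` format maps to date's `%Y-%m-%d:%H`:

```shell
# Predict the derived session id for an entity (format per the docs above).
ENTITY_ID="user_123"
SESSION_ID="${ENTITY_ID}-$(date -u +%Y-%m-%d:%H)"
echo "$SESSION_ID"   # e.g. user_123-2026-02-10:14
```

Note that the hour component means all requests from the same entity within a UTC hour share one session.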
After configuring any client:
- recall and advanced_augmentation appear in the tools list
- recall should return a response (even if empty for new entities)
- advanced_augmentation returns the memory being created

If you receive 401 errors, double-check your X-Memori-API-Key value. See the Troubleshooting guide for more help.
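You can also exercise the endpoint outside any client with a raw request. The sketch below assumes the standard MCP JSON-RPC wire format for listing tools; the headers come from the table above, and the curl command is left as a comment so you can run it with real credentials:

```shell
# JSON-RPC request to list the server's tools (standard MCP wire shape).
PAYLOAD='{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
echo "$PAYLOAD"

# Run with real credentials:
#   curl -s https://api.memorilabs.ai/mcp/ \
#     -H "Content-Type: application/json" \
#     -H "X-Memori-API-Key: $MEMORI_API_KEY" \
#     -H "X-Memori-Entity-Id: $MEMORI_ENTITY_ID" \
#     -d "$PAYLOAD"
```

A response listing recall and advanced_augmentation confirms the key and entity id are accepted; a 401 points back at the X-Memori-API-Key header.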