Server data from the Official MCP Registry
Search and inspect 6,500+ curated AI apps from the HyperStore directory.
Remote endpoints:
- streamable-http: https://mcp.store.hypergpt.ai/mcp
- sse: https://mcp.store.hypergpt.ai/sse
HyperStore MCP is a well-structured, read-only wrapper around a public REST API with no authentication requirements or sensitive operations. The codebase demonstrates good security practices: proper error handling, input validation via Pydantic, context managers for resource cleanup, and no credential storage. Permissions align well with the server's purpose (read-only API calls to HyperStore). Minor code quality observations exist but do not present meaningful security risks. Supply chain analysis found 3 known vulnerabilities in dependencies (0 critical, 3 high severity). Package verification found 1 issue.
7 files analyzed · 7 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Available as Local & Remote
This plugin can run on your machine or connect to a hosted endpoint; you choose the mode during install.
From the project's GitHub README.
Plug 6,500+ AI apps into any LLM via the Model Context Protocol.
HyperStore is a curated directory of 6,500+ AI applications, developed by HyperGPT. This MCP server exposes the HyperStore catalog to any LLM client — Claude, ChatGPT, Cursor, Windsurf, Cline, Zed, Gemini, and anything else that speaks MCP.
Ask your LLM:
"Find me a free AI tool that summarises PDFs." "Compare ChatGPT, Claude, and Gemini side-by-side." "Show me the top 5 image-generation apps with an API."
The LLM calls HyperStore MCP behind the scenes and answers with up-to-date, curated results.
8 tools:
| Tool | Purpose |
|---|---|
| search_apps | Full-text keyword search |
| ai_search | Embedding-based semantic search |
| get_app | Full app detail (features, screenshots, pricing) |
| list_apps | Paginated apps with filters (category, pricing) |
| list_categories | Browse all 30+ categories |
| category_apps | Apps within a category |
| browse_apps | A-Z directory listing |
| get_homepage | Trending + top categories overview |
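Under MCP, every tool invocation is a JSON-RPC 2.0 `tools/call` request. A minimal sketch of the request body a client sends for `search_apps` (the `query` argument name is an assumption here; a real client discovers each tool's input schema via `tools/list`):

```python
import json

def tools_call_request(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 tools/call request body, as an MCP client would."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# e.g. a full-text search (argument name assumed for illustration)
body = tools_call_request("search_apps", {"query": "pdf summarizer"})
```

The same envelope works for all eight tools; only `params.name` and `params.arguments` change.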
3 resources:
- hyperstore://app/{slug} — markdown rendering of any app
- hyperstore://category/{slug} — top apps in a category
- hyperstore://catalog — full category index

3 prompts:
- find_tool_for_task — guided discovery for a task
- compare_apps — side-by-side app comparison
- discover_category — explore a topic

uvx (zero install, recommended)

Requires uv. One command and you're done:
uvx hyperstore-mcp
pipx

pipx install hyperstore-mcp
hyperstore-mcp
docker run --rm -p 8080:8080 ghcr.io/deficlow/hyperstore-mcp
# Now MCP Streamable HTTP at http://localhost:8080/mcp
Use our managed Streamable HTTP server:
https://mcp.store.hypergpt.ai/mcp
Edit ~/Library/Application Support/Claude/claude_desktop_config.json
(macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
Restart Claude → tools appear in the 🛠 menu.
claude mcp add hyperstore -- uvx hyperstore-mcp
.cursor/mcp.json (project) or ~/.cursor/mcp.json (global):
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
~/.codeium/windsurf/mcp_config.json:
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
settings.json:
{
"cline.mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
~/.config/zed/settings.json:
{
"context_servers": {
"hyperstore": {
"command": {
"path": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
}
~/.gemini/settings.json:
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
Settings → Connectors → Add custom connector:
https://mcp.store.hypergpt.ai/mcp

from openai import OpenAI
client = OpenAI()
response = client.responses.create(
model="gpt-4.1",
tools=[{
"type": "mcp",
"server_label": "hyperstore",
"server_url": "https://mcp.store.hypergpt.ai/mcp",
"require_approval": "never",
}],
input="Find me 3 free AI tools for writing unit tests.",
)
print(response.output_text)
from anthropic import Anthropic
client = Anthropic()
response = client.messages.create(
model="claude-opus-4-7",
max_tokens=1024,
mcp_servers=[{
"type": "url",
"url": "https://mcp.store.hypergpt.ai/mcp",
"name": "hyperstore",
}],
messages=[{"role": "user", "content": "Top 5 AI image generators?"}],
)
See examples/ for ready-to-paste configs for every supported client.
# Streamable HTTP (modern, ChatGPT/OpenAI/Anthropic)
hyperstore-mcp --transport http --host 0.0.0.0 --port 8080
# Legacy SSE (older MCP clients)
hyperstore-mcp --transport sse --port 8080
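The flag handling behind those commands can be sketched roughly as follows (illustrative only; the package's actual CLI implementation may differ, and the stdio default is an assumption based on the local-testing instructions below):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """CLI flags matching the documented transports: stdio, http, sse."""
    p = argparse.ArgumentParser(prog="hyperstore-mcp")
    p.add_argument("--transport", choices=["stdio", "http", "sse"], default="stdio")
    p.add_argument("--host", default="0.0.0.0")   # MCP_HOST default
    p.add_argument("--port", type=int, default=8080)  # MCP_PORT default
    return p

args = build_parser().parse_args(["--transport", "http", "--port", "8080"])
```

`--host` and `--port` only matter for the http/sse transports; stdio ignores them.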
The hosted endpoint at https://mcp.store.hypergpt.ai runs the Docker image
behind a CDN — no auth, rate-limited per IP.
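Because the hosted endpoint is rate-limited per IP, clients that batch many requests should back off when they hit the limit. A minimal exponential-backoff schedule helper (illustrative; the actual rate thresholds are not published):

```python
def backoff_delays(retries: int, base: float = 0.5, cap: float = 30.0) -> list[float]:
    """Exponential backoff schedule: base * 2^n seconds, capped at `cap`."""
    return [min(cap, base * (2 ** n)) for n in range(retries)]

delays = backoff_delays(6)  # → [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]
```

On an HTTP 429 response, sleep for the next delay in the schedule before retrying.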
All settings come from environment variables (see .env.example):
| Variable | Default | Purpose |
|---|---|---|
| HYPERSTORE_API_BASE | https://store.hypergpt.ai | Upstream API base URL |
| HYPERSTORE_TIMEOUT | 20 | HTTP timeout in seconds |
| HYPERSTORE_USER_AGENT | hyperstore-mcp/{version} | UA string |
| MCP_HOST | 0.0.0.0 | Bind host (http/sse only) |
| MCP_PORT | 8080 | Bind port (http/sse only) |
| LOG_LEVEL | INFO | Logging level |
git clone https://github.com/deficlow/HyperStore-MCP
cd HyperStore-MCP
uv sync --all-extras
uv run pytest
uv run hyperstore-mcp # stdio mode for local testing
Inspect the running server with the official MCP Inspector:
npx @modelcontextprotocol/inspector uvx hyperstore-mcp
HyperStore MCP is a thin async wrapper around the HyperStore public REST API. It is read-only — no credentials, no writes, no PII. The same data that powers the website powers the MCP server. Updates land in your LLM the moment they land on the site.
LLM client ──MCP──▶ hyperstore-mcp ──HTTPS──▶ store.hypergpt.ai/api
MIT © HyperGPT