Server data from the Official MCP Registry
Context portability MCP server for compressing, formatting, and resuming AI conversation handoffs.
Valid MCP server (3 strong and 1 medium validity signal). No known CVEs in dependencies. Package registry verified. Imported from the Official MCP Registry.
5 files analyzed · 1 issue found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Set these up before or after installing:
Environment variable: OPENAI_API_KEY
Environment variable: OPENAI_MODEL
Environment variable: CONTEXTBRIDGE_HOSTED_URL
Environment variable: CONTEXTBRIDGE_HOSTED_API_KEY
Environment variable: CONTEXTBRIDGE_HOSTED_MODEL
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-prateekg7-context-bridge": {
"env": {
"OPENAI_MODEL": "your-openai-model-here",
"OPENAI_API_KEY": "your-openai-api-key-here",
"CONTEXTBRIDGE_HOSTED_URL": "your-contextbridge-hosted-url-here",
"CONTEXTBRIDGE_HOSTED_MODEL": "your-contextbridge-hosted-model-here",
"CONTEXTBRIDGE_HOSTED_API_KEY": "your-contextbridge-hosted-api-key-here"
},
"args": [
"-y",
"@contextbridge_ai/mcp"
],
"command": "npx"
}
}
}
From the project's GitHub README.
ContextBridge is a context portability tool for AI conversations. This package publishes the MCP server surface: it compresses an in-progress conversation into a structured context block that preserves the user's goal, confirmed decisions, current state, open threads, artifacts, constraints, recent pivots, and suggested next step, then formats that handoff so another AI can continue without forcing the user to restate everything.
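The description above lists the pieces of conversation state a compressed context block preserves. As a rough sketch of what that structure could look like (field names here are illustrative guesses inferred from the description, not the package's actual schema):

```typescript
// Hypothetical shape of a compressed context block, inferred from the
// description above. Field names are illustrative, not the real schema.
interface ContextBlock {
  goal: string;           // the user's overall goal
  decisions: string[];    // confirmed decisions so far
  currentState: string;   // where the work stands now
  openThreads: string[];  // unresolved questions or tasks
  artifacts: string[];    // code, docs, or files produced
  constraints: string[];  // requirements the next AI must respect
  recentPivots: string[]; // late changes of direction
  nextStep: string;       // suggested next step for the resuming AI
}

// Example block for a handoff midway through a refactoring task.
const example: ContextBlock = {
  goal: "Migrate the billing service from REST to gRPC",
  decisions: ["Keep the REST endpoints live until Q3"],
  currentState: "Proto definitions drafted; server stubs generated",
  openThreads: ["How to version the proto package"],
  artifacts: ["billing.proto", "server_stub.ts"],
  constraints: ["No breaking changes for existing clients"],
  recentPivots: ["Switched from Connect to plain gRPC"],
  nextStep: "Implement the CreateInvoice handler",
};

console.log(Object.keys(example).length); // 8 fields, one per preserved item above
```

A structured block like this is what lets the destination AI resume without the user restating the whole history.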
The published npm artifact is the stdio MCP server for Claude Desktop, agent runtimes, and other MCP-compatible hosts. The browser extension runtime is developed in the same repository, but it is not part of this MCP Registry publishing flow.
Add this to your Claude Desktop MCP config (~/Library/Application Support/Claude/claude_desktop_config.json):
{
"mcpServers": {
"contextbridge": {
"command": "npx",
"args": ["-y", "@contextbridge_ai/mcp"],
"env": {
"GROQ_API_KEY": "your_groq_api_key_here"
}
}
}
}
Or with OpenAI:
{
"mcpServers": {
"contextbridge": {
"command": "npx",
"args": ["-y", "@contextbridge_ai/mcp"],
"env": {
"OPENAI_API_KEY": "your_openai_api_key_here",
"OPENAI_MODEL": "gpt-4.1-mini"
}
}
}
}
For a hosted no-key lane instead of BYOK:
{
"mcpServers": {
"contextbridge": {
"command": "npx",
"args": ["-y", "@contextbridge_ai/mcp"],
"env": {
"CONTEXTBRIDGE_HOSTED_URL": "https://your-hosted-endpoint.example/v1",
"CONTEXTBRIDGE_HOSTED_MODEL": "mistralai/Mistral-7B-Instruct-v0.2"
}
}
}
}
compress_context
Compresses a raw transcript into a structured context block.
Parameters:
transcript (string, required): the raw conversation transcript to compress.
provider ("openai" | "openai-chat" | "google" | "groq" | "custom" | "hosted", optional): provider override.
model (string, optional): model override for the selected lane.
apiKey (string, optional): BYOK API key override.
baseUrl (string, optional): API base URL override.
hostedApiKey (string, optional): optional auth key for the hosted lane.

format_for_target
Formats a compressed context block for a destination platform.
Parameters:
context_block (object, required): a validated ContextBridge compressed context block.
target ("claude" | "chatgpt" | "gemini", required): destination AI platform.

resume_context
Builds a destination-ready landing package from a compressed context block.
Parameters:
context_block (object, required): a validated ContextBridge compressed context block.
target ("claude" | "chatgpt" | "gemini", required): destination AI platform.

ContextBridge auto-selects the OpenAI Responses lane when a BYOK key is present.
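MCP hosts invoke these tools through the protocol's standard JSON-RPC tools/call method. As a sketch of the request a client would send for compress_context (the argument names come from the parameter list above; the transcript is an invented example):

```typescript
// Sketch of an MCP tools/call request for compress_context.
// The envelope follows MCP's JSON-RPC convention; argument names match
// the tool's documented parameters. Transcript content is invented.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "compress_context",
    arguments: {
      transcript: "User: Let's migrate billing to gRPC...\nAI: Agreed, drafting protos...",
      provider: "openai",    // optional lane override
      model: "gpt-4.1-mini", // optional model for the selected lane
    },
  },
};

console.log(JSON.stringify(request, null, 2));
```

Only transcript is required; the provider, model, and key overrides are for callers that want to bypass the environment-based lane selection described below.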
Environment variables:
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4.1-mini
OPENAI_BASE_URL=https://api.openai.com/v1
Additional supported provider overrides:
GOOGLE_API_KEY=your_google_api_key_here
GOOGLE_MODEL=gemini-2.5-pro
GOOGLE_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai
GROQ_API_KEY=your_groq_api_key_here
GROQ_MODEL=llama-3.3-70b-versatile
GROQ_BASE_URL=https://api.groq.com/openai/v1
CONTEXTBRIDGE_API_KEY=your_custom_api_key_here
CONTEXTBRIDGE_MODEL=your_custom_model_here
CONTEXTBRIDGE_BASE_URL=https://your-endpoint.example/v1
CONTEXTBRIDGE_PROVIDER=custom
The hosted lane is the no-key path for end users who do not bring their own provider credentials.
Environment variables:
CONTEXTBRIDGE_HOSTED_URL=https://your-hosted-endpoint.example/v1
CONTEXTBRIDGE_HOSTED_MODEL=mistralai/Mistral-7B-Instruct-v0.2
CONTEXTBRIDGE_HOSTED_API_KEY=
Selection priority:
1. An apiKey option passed into compression uses the OpenAI Responses BYOK lane.
2. OPENAI_API_KEY uses the OpenAI Responses BYOK lane.
3. CONTEXTBRIDGE_HOSTED_URL uses the hosted lane.

Contributions are welcome. The highest-leverage areas are extraction quality evaluation, browser collector resilience, destination formatting quality, hosted-lane infrastructure, and real-world handoff testing across platforms.
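That selection priority can be sketched as a small function (a simplified reconstruction for illustration, not the package's actual code):

```typescript
type Lane = "openai-byok" | "hosted" | "none";

// Simplified reconstruction of the lane-selection priority described above:
// an explicit apiKey wins, then OPENAI_API_KEY, then the hosted URL.
function selectLane(
  opts: { apiKey?: string },
  env: { OPENAI_API_KEY?: string; CONTEXTBRIDGE_HOSTED_URL?: string },
): Lane {
  if (opts.apiKey) return "openai-byok";             // per-call BYOK override
  if (env.OPENAI_API_KEY) return "openai-byok";      // env-level BYOK
  if (env.CONTEXTBRIDGE_HOSTED_URL) return "hosted"; // no-key hosted lane
  return "none";                                     // no provider configured
}

console.log(selectLane({}, { CONTEXTBRIDGE_HOSTED_URL: "https://example.test/v1" })); // "hosted"
console.log(selectLane({ apiKey: "sk-test" }, { OPENAI_API_KEY: "env-key" }));        // "openai-byok"
```

Note the consequence: setting OPENAI_API_KEY anywhere in the environment shadows a configured hosted lane, so the hosted path only applies when no BYOK key is present.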
MIT. See LICENSE.