## Key takeaways

- Change one line (`mcp.run(transport="sse")`) to make any local MCP server remote
- Users connect with a URL: no pip install or local dependencies needed
- Deploy with Docker to Railway, any cloud provider, or a VPS
- Deploy with Docker to Railway, any cloud provider, or a VPS
## What is a remote MCP server?
Most MCP servers run locally: users install a package, and their AI assistant launches it as a subprocess. A remote MCP server runs on your infrastructure instead. Users connect to a URL.
New to MCP? Read What is an MCP server? first. Before building a remote server, you may want to start with How to build an MCP server to understand the local approach.
```json
{
  "mcpServers": {
    "your-server": {
      "url": "https://your-server.com/mcp"
    }
  }
}
```
No pip install. No npx. No local dependencies. The user pastes a URL and they are connected.
## When to use remote vs local
| | Local (pip/npx) | Remote (URL) |
|---|---|---|
| User setup | Install package + config | Paste URL |
| Code visibility | User has the code | Code stays on your server |
| Infra cost | None (runs on user's machine) | You pay for hosting |
| Best for | Open source tools, utilities | Proprietary data, AI models, premium services |
| Piracy risk | Low (license key) | None (code never distributed) |
| Offline use | Works offline | Requires internet |
Use local for open-source tools where distribution is the goal. Use remote when you need full control over the code, data, or access.
## Building a remote server
### The transport difference
Local MCP servers use stdio: they read from stdin and write to stdout. Your AI assistant launches them as a subprocess.
Remote MCP servers use SSE (Server-Sent Events) or Streamable HTTP: they listen on an HTTP port and handle requests over the network.
In code, the only change is the transport:
Local (default):
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("my-server")

@mcp.tool()
def my_tool(query: str) -> str:
    return do_thing(query)

if __name__ == "__main__":
    mcp.run()  # stdio by default
```
Remote:
```python
import os

from mcp.server.fastmcp import FastMCP

# Host and port are FastMCP settings, passed to the constructor
# (run() itself only takes the transport)
port = int(os.environ.get("PORT", "8000"))
mcp = FastMCP("my-server", host="0.0.0.0", port=port)

@mcp.tool()
def my_tool(query: str) -> str:
    return do_thing(query)

if __name__ == "__main__":
    mcp.run(transport="sse")
```
Only the transport setup changed. Everything else (tool definitions, business logic, error handling) stays the same.
### Scaffold with MCP Creator
This is the fastest way to set up a remote server. Both mcp-creator-typescript and mcp-creator-python support hosting="remote":
```python
scaffold_server(
    package_name="my-remote-mcp",
    description="A remote hosted MCP server",
    tools=["my_tool"],
    hosting="remote"
)
```
This generates:
- Server entry point with HTTP transport (Streamable HTTP for TypeScript, SSE for Python)
- Dockerfile for containerized deployment
- .env.example with PORT=8000
- README.md with remote config examples
### Dockerfile
A minimal Dockerfile for a Python MCP server:
```dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install --no-cache-dir -e .
EXPOSE 8000
ENV PORT=8000
CMD ["python", "-m", "my_mcp_server.server"]
```
Build and test locally:
```shell
docker build -t my-mcp-server .
docker run -p 8000:8000 my-mcp-server
```
Then point your AI assistant at http://localhost:8000/sse to test (with the SSE transport, the default endpoint path is /sse; Streamable HTTP servers serve /mcp).
## Deployment options
Any platform that runs Docker containers works. Two common paths:
Railway (simplest): Push your code to GitHub, connect the repo in Railway, and it auto-detects the Dockerfile and deploys. You get a URL like https://my-server.up.railway.app. Set environment variables in their dashboard.
Any cloud or VPS: Deploy the Docker container to AWS (ECS/Lambda), GCP (Cloud Run), Fly.io, or a $5/month VPS with docker compose up -d. Use a reverse proxy (nginx, Caddy) for HTTPS.
## Authentication
Remote servers need authentication. Users should not be able to connect without credentials.
### API key in headers
The simplest approach: users include an API key in their MCP config:
```json
{
  "mcpServers": {
    "your-server": {
      "url": "https://your-server.com/mcp",
      "headers": {
        "Authorization": "Bearer mcp_live_abc123..."
      }
    }
  }
}
```
On your server, check the header on each request. The MCP SDK does not handle auth for you: add middleware or check the header in your tool handlers.
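As a concrete illustration, the header check can live in a small helper that your middleware calls on every request. A minimal sketch (the key set and its contents are placeholders; in practice you would load keys from your key store):

```python
import hmac

# Placeholder key set; in practice, load these from your key store
VALID_KEYS = {"mcp_live_abc123"}

def check_auth(headers: dict[str, str]) -> bool:
    """Return True if the request carries a valid Bearer API key."""
    auth = headers.get("Authorization") or headers.get("authorization") or ""
    scheme, _, token = auth.partition(" ")
    if scheme.lower() != "bearer" or not token:
        return False
    # Constant-time comparison avoids leaking key contents via timing
    return any(hmac.compare_digest(token, key) for key in VALID_KEYS)
```

When this returns False, reject the request with a 401 before it reaches your tool handlers.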
### License key verification
If you are selling through MCP Marketplace, use the mcp-marketplace-license SDK:
```python
from mcp_marketplace_license import verify_license

@mcp.tool()
def premium_tool(query: str) -> str:
    # For remote: read key from request header or query param
    result = verify_license(slug="my-server")
    if not result.get("valid"):
        return "Invalid or missing license key."
    return do_premium_thing(query)
```
See the license key guide for the full verification flow.
## Combining remote + paid
For a premium remote server (paid access, code never distributed):
```python
scaffold_server(
    package_name="my-premium-mcp",
    tools=["free_tool", "premium_tool"],
    paid=True,
    paid_tools=["premium_tool"],
    hosting="remote"
)
```
This generates a server with:
- SSE transport for remote hosting
- License verification on premium tools
- Dockerfile for deployment
- Config examples with auth headers
This is the most secure monetization model. Users never get your code, and access is controlled per-request.
## Things to consider
Latency. Local servers respond in milliseconds. Remote servers add network round-trip time. For most tools this is not noticeable, but for tools that the AI calls many times in a loop, the latency adds up.
Uptime. Your server is now a service. If it goes down, your users' tools stop working. Use health checks, monitoring, and redundancy for production servers.
Costs. You are paying for hosting. Price your server accordingly: remote servers typically justify subscription pricing ($10-50+/mo) rather than one-time fees.
Rate limiting. Without rate limiting, a single user can overwhelm your server. Add per-key rate limits, especially for expensive operations.
## Next steps
- How to monetize your MCP server: the full guide to both local and remote pricing models
- License keys for MCP servers: how verification works for both local and remote
- Free vs Pro: the freemium strategy for MCP servers
- MCP Creator: scaffold servers with AI in TypeScript or Python
- How to install an MCP server: how users connect to your remote server from their AI client