Server data from the Official MCP Registry
Provision private AI model endpoints on dedicated GPUs (Llama, Qwen, Mistral). Pay per minute.
Remote endpoints: streamable-http: https://api.auxen.ai/mcp
Valid MCP server (1 strong and 1 medium validity signal). No known CVEs in dependencies. Imported from the Official MCP Registry.
Endpoint verified · Requires authentication · 1 issue found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Remote Plugin
No local installation needed. Your AI client connects to the remote endpoint directly.
Add this to your MCP configuration to connect:
{
  "mcpServers": {
    "ai-auxen-auxen": {
      "url": "https://api.auxen.ai/mcp"
    }
  }
}

From the project's GitHub README.
This is the public manifest repo for the Auxen MCP (Model Context Protocol) server. The server itself runs at https://api.auxen.ai/mcp — this repo exists so registries (Smithery, Glama, the official MCP registry) have a canonical place to read metadata from.
The Auxen MCP server is a remote, StreamableHTTP server. Add it to your MCP client by URL:
https://api.auxen.ai/mcp
Authentication uses OAuth 2.1 + PKCE (recommended for browser-based clients) or a direct Auxen API key (auxen_live_* / auxen_test_*) sent as Authorization: Bearer <key>.
The discovery metadata is at:
- https://api.auxen.ai/.well-known/oauth-authorization-server (RFC 8414)
- https://api.auxen.ai/.well-known/oauth-protected-resource (RFC 9728)

Clients that support Dynamic Client Registration (RFC 7591), including Claude.ai's Connectors Directory, can register themselves automatically. After registration, the client redirects the user's browser to https://api.auxen.ai/oauth/authorize, the user logs in to Auxen and approves the connection at https://auxen.ai/oauth/authorize, and the client receives an authorization code that it exchanges for an access token at https://api.auxen.ai/oauth/token.
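The PKCE half of that flow can be sketched locally. This is a minimal illustration, not Auxen-specific client code: the `code_verifier`/`code_challenge` construction follows RFC 7636 (S256), and the `client_id`/`redirect_uri` values are whatever Dynamic Client Registration returned for your client, shown here as placeholders.

```python
import base64
import hashlib
import secrets
from urllib.parse import urlencode


def make_pkce_pair() -> tuple[str, str]:
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    # 32 random bytes -> 43-char URL-safe string after stripping '=' padding
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge


def build_authorize_url(client_id: str, redirect_uri: str, challenge: str) -> str:
    """Assemble the authorization URL with standard OAuth 2.1 query parameters."""
    params = {
        "response_type": "code",
        "client_id": client_id,          # from Dynamic Client Registration
        "redirect_uri": redirect_uri,    # registered callback, placeholder here
        "code_challenge": challenge,
        "code_challenge_method": "S256",
    }
    return "https://api.auxen.ai/oauth/authorize?" + urlencode(params)
```

After the user approves, the client would POST the returned authorization code plus the original `code_verifier` to https://api.auxen.ai/oauth/token to obtain the access token.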
For agents that don't go through a browser, generate an auxen_live_* (or auxen_test_*) key at https://auxen.ai/dashboard/api-keys and send it as Authorization: Bearer <key> on every MCP call.
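For the API-key path, the header construction is straightforward. A minimal sketch, assuming only the key formats the README names; the prefix check is a local sanity guard, not a rule enforced by the API:

```python
def auth_header(api_key: str) -> dict[str, str]:
    """Build the Authorization header for direct Auxen API-key auth."""
    # Keys look like auxen_live_* or auxen_test_* per the README.
    if not api_key.startswith(("auxen_live_", "auxen_test_")):
        raise ValueError("expected an auxen_live_* or auxen_test_* key")
    return {"Authorization": f"Bearer {api_key}"}
```

This header must accompany every MCP call to https://api.auxen.ai/mcp when not using OAuth.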
| Tool | Effect | Hint |
|---|---|---|
| auxen_list_models | List available models, optionally filtered by size | read-only |
| auxen_get_instance_status | Get status, endpoint, and api_key for an instance | read-only |
| auxen_list_instances | List all instances on the account | read-only |
| auxen_get_balance | Read USD credits + active subscriptions | read-only |
| auxen_provision_model | Provision a new model instance — spends money | destructive |
| auxen_destroy_instance | Destroy an instance — irreversible | destructive |
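Since two of these tools spend money or destroy resources, an agent harness may want a client-side guard keyed on the hint column. A hypothetical sketch: `DESTRUCTIVE_TOOLS` and `build_tool_call` are illustrative names, not part of the Auxen API; the payload shape follows the MCP `tools/call` JSON-RPC convention.

```python
# Tools the table above marks "destructive"; this set is a local policy choice.
DESTRUCTIVE_TOOLS = {"auxen_provision_model", "auxen_destroy_instance"}


def build_tool_call(name: str, arguments: dict, confirm: bool = False) -> dict:
    """Build an MCP tools/call JSON-RPC payload, refusing destructive
    tools unless the caller explicitly confirms."""
    if name in DESTRUCTIVE_TOOLS and not confirm:
        raise PermissionError(f"{name} is marked destructive; pass confirm=True")
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }
```

Read-only tools pass through unchanged; `auxen_destroy_instance` requires an explicit `confirm=True` before the request is even constructed.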
Auxen provisions private, dedicated GPU instances running open-source models (Llama 3.1, Qwen 2.5, Mistral, Gemma 2, Phi-3). Each instance is fully private — no shared inference, no third-party routing. Pay-per-minute billing, no subscriptions.
For the human-facing documentation: