Server data from the Official MCP Registry
A Snowflake MCP server — SQL queries, schema exploration, and data insights for AI assistants
A well-structured Snowflake MCP server with strong authentication options (multiple methods, TOML configs, env vars) and good code quality (type hints, logging, error handling). Write operations are safely guarded by default. However, the server requires broad network and file permissions (appropriate for its purpose) and has some input validation gaps in SQL identifier checking that could enable SQL injection in edge cases. Supply chain analysis found 3 known vulnerabilities in dependencies (0 critical, 3 high severity). Package verification found 1 issue.
4 files analyzed · 9 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Set these up before or after installing:
Environment variable: SNOWFLAKE_ACCOUNT
Environment variable: SNOWFLAKE_USER
Environment variable: SNOWFLAKE_PASSWORD
Environment variable: SNOWFLAKE_WAREHOUSE
Environment variable: SNOWFLAKE_DATABASE
Environment variable: SNOWFLAKE_SCHEMA
Environment variable: SNOWFLAKE_ROLE
Environment variable: SNOWFLAKE_AUTHENTICATOR
Environment variable: SNOWFLAKE_PRIVATE_KEY_FILE
Environment variable: SNOWFLAKE_PRIVATE_KEY_FILE_PWD
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-nsphung-mcp-snowflake-server": {
"env": {
"SNOWFLAKE_ROLE": "your-snowflake-role-here",
"SNOWFLAKE_USER": "your-snowflake-user-here",
"SNOWFLAKE_SCHEMA": "your-snowflake-schema-here",
"SNOWFLAKE_ACCOUNT": "your-snowflake-account-here",
"SNOWFLAKE_DATABASE": "your-snowflake-database-here",
"SNOWFLAKE_PASSWORD": "your-snowflake-password-here",
"SNOWFLAKE_WAREHOUSE": "your-snowflake-warehouse-here",
"SNOWFLAKE_AUTHENTICATOR": "your-snowflake-authenticator-here",
"SNOWFLAKE_PRIVATE_KEY_FILE": "your-snowflake-private-key-file-here",
"SNOWFLAKE_PRIVATE_KEY_FILE_PWD": "your-snowflake-private-key-file-pwd-here"
},
"args": [
"mcp-snowflake-server-nsp"
],
"command": "uvx"
}
}
}
From the project's GitHub README.
A Model Context Protocol (MCP) server that connects AI assistants to Snowflake — enabling SQL queries, schema exploration, and data insights directly from your LLM client.
Highlights:
- TOML connection files manage production, staging, and development environments in one file
- --exclude-json-results flag reduces LLM context window usage
- --exclude_tools disables specific tools by name
The fastest way to try it — using uvx with a TOML connection file:
# 1. Create a connections file
cat > ~/snowflake_connections.toml << 'EOF'
[myconn]
account = "your_account"
user = "your_user"
password = "your_password"
warehouse = "COMPUTE_WH"
database = "MY_DB"
schema = "PUBLIC"
role = "MYROLE"
EOF
# 2. Run the server
uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
--connections-file ~/snowflake_connections.toml \
--connection-name myconn
Add to your MCP client config (e.g. claude_desktop_config.json) using snowflake_connections.toml:
"mcpServers": {
"snowflake": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "myconn"
]
}
}
Add to your MCP client config (e.g. .vscode/mcp.json) using .env file (see Authentication):
"snowflake": {
// Snowflake MCP server
"type": "stdio",
"command": "uvx",
"args": [
"--from", "mcp-snowflake-server-nsp",
"--python=3.13",
"mcp_snowflake_server"
],
"envFile": "${workspaceFolder}/.env"
}
Add to your MCP client config (e.g. opencode.jsonc) with .env file (see Authentication):
"snowflake": {
"type": "local",
"command": [
"uvx",
"--from",
"mcp-snowflake-server-nsp",
"--python=3.13",
"mcp_snowflake_server",
],
"enabled": true,
"timeout": 300000,
}
| URI | Description |
|---|---|
| memo://insights | A continuously updated memo aggregating data insights appended via append_insight. |
| context://table/{table_name} | (Prefetch mode only) Per-table schema summaries including columns and comments. |
| Tool | Description | Requires |
|---|---|---|
| read_query | Execute SELECT queries. Input: query (string). | — |
| write_query | Execute INSERT, UPDATE, or DELETE queries. Input: query (string). | --allow_write |
| create_table | Execute CREATE TABLE statements. Input: query (string). | --allow_write |
| Tool | Description | Input |
|---|---|---|
| list_databases | List all databases in the Snowflake instance. | — |
| list_schemas | List all schemas within a database. | database (string) |
| list_tables | List all tables within a database and schema. | database, schema (strings) |
| describe_table | Describe columns of a table (name, type, nullability, default, comment). | table_name as database.schema.table |
| Tool | Description | Input |
|---|---|---|
| append_insight | Add a data insight to the memo://insights resource. | insight (string) |
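Since describe_table takes a fully qualified database.schema.table name, and the security summary above notes gaps in SQL identifier checking, the kind of validation such a tool needs can be sketched as below. This is an illustrative helper, not the server's actual code; the function name and regex are assumptions that only accept plain unquoted Snowflake identifiers.

```python
import re

# Plain (unquoted) Snowflake identifiers: letter or underscore, then
# letters, digits, underscores, or dollar signs. Anything else (quotes,
# semicolons, whitespace) is rejected rather than interpolated into SQL.
IDENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_$]*$")

def split_qualified_name(table_name: str) -> tuple[str, str, str]:
    """Split 'database.schema.table' into parts, validating each one."""
    parts = table_name.split(".")
    if len(parts) != 3:
        raise ValueError("expected a fully qualified database.schema.table name")
    for part in parts:
        if not IDENT.match(part):
            raise ValueError(f"invalid identifier: {part!r}")
    database, schema, table = parts
    return database, schema, table
```

Rejecting anything outside the plain-identifier grammar is stricter than Snowflake itself (which also allows quoted identifiers), but it closes the injection edge cases flagged in the security summary.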
Set credentials via environment variables or CLI flags (see Configuration Reference):
SNOWFLAKE_USER="user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_PASSWORD="secret"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
SNOWFLAKE_USER="user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_PRIVATE_KEY_FILE="/absolute/path/to/key.p8"
SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase" # Optional — only if key is encrypted
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
Or via CLI: --private_key_file /path/to/key.p8 --private_key_file_pwd passphrase
SNOWFLAKE_AUTHENTICATOR="externalbrowser"
Or in a TOML connection entry: authenticator = "externalbrowser"
Manage multiple environments in a single file. See example_connections.toml for a full template.
[production]
account = "your_account"
user = "your_user"
password = "your_password"
warehouse = "COMPUTE_WH"
database = "PROD_DB"
schema = "PUBLIC"
role = "ACCOUNTADMIN"
[development]
account = "your_account"
user = "dev_user"
authenticator = "externalbrowser"
warehouse = "DEV_WH"
database = "DEV_DB"
schema = "PUBLIC"
role = "DEVELOPER"
[reporting]
account = "your_account"
user = "reporting_user"
private_key_file = "/path/to/private_key.pem"
private_key_file_pwd = "passphrase" # Optional
warehouse = "REPORTING_WH"
database = "REPORTING_DB"
schema = "REPORTS"
role = "REPORTING_ROLE"
Pass the file with --connections-file and select a profile with --connection-name. Both flags are required together.
The package is published on PyPI as mcp-snowflake-server-nsp.
"mcpServers": {
"snowflake_production": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/path/to/snowflake_connections.toml",
"--connection-name", "production"
// Optional flags — see Configuration Reference
]
},
"snowflake_staging": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/path/to/snowflake_connections.toml",
"--connection-name", "staging"
]
}
}
"mcpServers": {
"snowflake": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--account", "your_account",
"--warehouse", "your_warehouse",
"--user", "your_user",
"--password", "your_password",
"--role", "your_role",
"--database", "your_database",
"--schema", "your_schema"
// Optional: "--private_key_file", "/absolute/path/key.p8"
// Optional: "--private_key_file_pwd", "passphrase"
// Optional flags — see Configuration Reference
]
}
}
Install Visual Studio Code
Install uv:
curl -LsSf https://astral.sh/uv/install.sh | sh
Create a .env file with your Snowflake credentials (or use a TOML connection file — see Authentication):
SNOWFLAKE_USER="user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_ROLE="MYROLE"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_PASSWORD="secret"
# Key-pair alternative:
# SNOWFLAKE_PRIVATE_KEY_FILE=/absolute/path/key.p8
# SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"
# Browser SSO alternative:
# SNOWFLAKE_AUTHENTICATOR="externalbrowser"
(Optional) Edit runtime_config.json to exclude specific databases, schemas, or tables (see Exclusion Patterns).
Test locally:
uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server
Add to .vscode/mcp.json:
"snowflake-local": {
"type": "stdio",
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "development"
// Optional flags — see Configuration Reference
],
}
"snowflake-local": {
"type": "stdio",
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
// Optional flags — see Configuration Reference / .env.example file
],
"envFile": "/absolute/path/to/.env"
}
Install Claude AI Desktop App
Install uv:
curl -LsSf https://astral.sh/uv/install.sh | sh
Create a .env file with your Snowflake credentials (or use a TOML connection file — see Authentication):
SNOWFLAKE_USER="user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_ROLE="MYROLE"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_PASSWORD="secret"
# Key-pair alternative:
# SNOWFLAKE_PRIVATE_KEY_FILE=/absolute/path/key.p8
# SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"
# Browser SSO alternative:
# SNOWFLAKE_AUTHENTICATOR="externalbrowser"
(Optional) Edit runtime_config.json to exclude specific databases, schemas, or tables (see Exclusion Patterns).
Test locally:
uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server
Add to claude_desktop_config.json:
"mcpServers": {
"snowflake_local": {
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "development"
// Optional flags — see Configuration Reference
]
}
}
"mcpServers": {
"snowflake_local": {
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server"
// Optional flags — see Configuration Reference
]
}
}
A Dockerfile is included for containerised deployments:
# Build
docker build -t mcp-snowflake-server .
# Run (pass credentials as environment variables)
docker run --rm \
-e SNOWFLAKE_USER="user@example.com" \
-e SNOWFLAKE_ACCOUNT="myaccount" \
-e SNOWFLAKE_PASSWORD="secret" \
-e SNOWFLAKE_WAREHOUSE="COMPUTE_WH" \
-e SNOWFLAKE_DATABASE="MY_DB" \
-e SNOWFLAKE_SCHEMA="PUBLIC" \
-e SNOWFLAKE_ROLE="MYROLE" \
mcp-snowflake-server
# Or override the entrypoint arguments directly
docker run --rm mcp-snowflake-server \
--account your_account \
--user your_user \
--password your_password \
--warehouse COMPUTE_WH \
--database MY_DB \
--schema PUBLIC \
--role MYROLE
All connection parameters can also be set as environment variables (SNOWFLAKE_<PARAM_UPPER>).
| Flag | Env var | Default | Description |
|---|---|---|---|
| --account | SNOWFLAKE_ACCOUNT | — | Snowflake account identifier |
| --user | SNOWFLAKE_USER | — | Snowflake username |
| --password | SNOWFLAKE_PASSWORD | — | Password (not required for key-pair / SSO) |
| --warehouse | SNOWFLAKE_WAREHOUSE | — | Virtual warehouse to use |
| --database | SNOWFLAKE_DATABASE | (required) | Default database |
| --schema | SNOWFLAKE_SCHEMA | (required) | Default schema |
| --role | SNOWFLAKE_ROLE | — | Role to assume |
| --private_key_file | SNOWFLAKE_PRIVATE_KEY_FILE | — | Absolute path to .p8 private key file |
| --private_key_file_pwd | SNOWFLAKE_PRIVATE_KEY_FILE_PWD | — | Passphrase for encrypted private key |
| --connections-file | — | — | Path to TOML connections file |
| --connection-name | — | — | Connection profile name in TOML file (required with --connections-file) |
| --allow_write | — | false | Enable write_query and create_table tools |
| --prefetch / --no-prefetch | — | false | Pre-load table schema as context://table/* resources (disables list_tables / describe_table) |
| --exclude_tools | — | [] | Space-separated list of tool names to disable |
| --exclude-json-results | — | false | Omit embedded JSON resources from responses (reduces context window usage) |
| --log_dir | — | — | Directory for log file output |
| --log_level | — | INFO | Log verbosity: DEBUG, INFO, WARNING, ERROR, CRITICAL |
Edit runtime_config.json to exclude databases, schemas, or tables from all discovery tools. Patterns are matched case-insensitively as substrings.
{
"exclude_patterns": {
"databases": ["temp"],
"schemas": ["temp", "information_schema"],
"tables": ["temp"]
}
}
The server loads this file automatically at startup from the working directory.
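Case-insensitive substring matching, as described above, can be sketched in a few lines; the helper name is illustrative, not the server's actual implementation:

```python
def is_excluded(name: str, patterns: list[str]) -> bool:
    """Return True if any exclusion pattern is a case-insensitive
    substring of the database/schema/table name."""
    lowered = name.lower()
    return any(pattern.lower() in lowered for pattern in patterns)
```

With the example runtime_config.json above, a schema named INFORMATION_SCHEMA is hidden because "information_schema" matches it case-insensitively, and a table named TEMP_ORDERS is hidden because it contains "temp".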
# Install dependencies (including dev tools)
make install
# Lint & auto-fix with Ruff
make ruff
# Run tests
make test
# Run tests with terminal coverage report
make coverage
# Run tests and open HTML coverage report
make coverage-html
# Run the server locally
make run
Requires uv. Dev dependencies include ruff, mypy, pytest, pytest-asyncio, pytest-cov, and pre-commit.
This project is licensed under the GNU General Public License v3.0. See the LICENSE file for the full text.
This repository is a fork of isaacwasserman/mcp-snowflake-server.
Maintained by nsphung. See the NOTICE file for attribution.