Server data from the Official MCP Registry
Static infrastructure analysis via MCP: databases, AWS services, IaC, and code patterns.
Infrawise is a well-architected infrastructure analysis tool with proper security practices. It enforces read-only access, uses environment variable substitution for credentials, and implements credential masking in outputs. Some minor code quality improvements are recommended, but the server's purpose and permissions are well-aligned. Supply chain analysis found 9 known vulnerabilities in dependencies (2 critical, 2 high severity). Package verification found 1 issue.
4 files analyzed · 15 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Set these up before or after installing:
Environment variable: AWS_REGION
Environment variable: AWS_ACCESS_KEY_ID
Environment variable: AWS_SECRET_ACCESS_KEY
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-sidd27-infrawise": {
"env": {
"AWS_REGION": "your-aws-region-here",
"AWS_ACCESS_KEY_ID": "your-aws-access-key-id-here",
"AWS_SECRET_ACCESS_KEY": "your-aws-secret-access-key-here"
},
"args": [
"-y",
"infrawise"
],
"command": "npx"
}
}
}

From the project's GitHub README:
Understand your infrastructure, not just your code.
Infrawise gives AI coding assistants deterministic infrastructure awareness.
It statically analyzes your codebase, cloud infrastructure, and database schemas, then exposes that context through MCP so tools like Claude Code can understand your actual tables, indexes, query patterns, and service relationships instead of guessing from source files alone.
AI coding assistants can read your source files but have no deterministic knowledge of your infrastructure. They do not know which GSIs exist, how tables are partitioned, which functions already trigger scans, or where indexes are missing. So they guess.
Infrawise replaces guessing with infrastructure-aware context.
Without Infrawise, an AI assistant might:
- Run .scan() on your Orders table that has 50M rows
- Suggest a GSI on status that you already have
- Write SELECT * when you need to keep query cost low

With Infrawise, it knows:

- The exact CREATE INDEX SQL or GSI config for your tables — not generic advice

Infrawise is not an AI agent framework, an infrastructure provisioning tool, an observability platform, or a cloud management dashboard.
It is a deterministic infrastructure intelligence layer for AI-assisted development.
npm install -g infrawise
or use without installing:
npx infrawise init
1. Initialize in your repo
cd your-project
infrawise init
Detects your AWS profile and region, asks a few questions, writes infrawise.yaml. That's the only file it creates in your project.
2. Validate everything is connected
infrawise doctor
3. Run analysis
infrawise analyze
Or skip this step — infrawise dev auto-runs analysis if no cache exists.
Findings (3 total)
1. [HIGH] Full table scan detected on DynamoDB table "Orders"
listAllOrders() scans without any filter — reads every item in the table.
Recommendation: Replace Scan with Query using a partition key or add a GSI.
2. [MEDIUM] PostgreSQL table "users" has no index on column "email"
Filtering on "email" causes sequential scans.
Recommendation: CREATE INDEX CONCURRENTLY idx_users_email ON users(email);
3. [MEDIUM] DynamoDB table "Sessions" accessed by 6 distinct code paths
High access concentration may create hot partition issues at scale.
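The first finding above maps to a concrete code change. Here is a hedged sketch of the two request shapes involved, written as plain objects rather than live AWS SDK calls; the partition key name is an assumption for illustration, not a real schema:

```typescript
// The Scan input that triggers finding #1: no key condition, so every
// item in the table is read.
const scanInput = { TableName: "Orders" };

// The recommended replacement: a Query constrained by a partition key.
// "customerId" is an assumed partition key, used only for illustration.
const queryInput = {
  TableName: "Orders",
  KeyConditionExpression: "customerId = :cid",
  ExpressionAttributeValues: { ":cid": { S: "cust-123" } },
};

console.log(queryInput.KeyConditionExpression);
```

A Query with a key condition reads only one partition's items, while the Scan pays for every item in the table regardless of what the caller needs.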
infrawise dev
✔ Config loaded infrawise.yaml
✔ Cached analysis loaded 42 nodes · 18 edges · 7 finding(s)
✔ Server running
┌────────────────────────────────────────────────────┐
│ MCP Server │
├────────────────────────────────────────────────────┤
│ POST http://localhost:3000/mcp │
│ GET http://localhost:3000/health │
├────────────────────────────────────────────────────┤
│ Tools (13 active) │
│ get_infra_overview · get_graph_summary │
│ ... │
└────────────────────────────────────────────────────┘
Watching for file changes... Press Ctrl+C to stop
Claude Code — edit .claude/settings.json in your repo (project-level) or ~/.claude/settings.json (global):
{
"mcpServers": {
"infrawise": {
"url": "http://localhost:3000/mcp"
}
}
}
To let Claude Code manage the server lifecycle automatically:
{
"mcpServers": {
"infrawise": {
"command": "infrawise",
"args": ["dev"]
}
}
}
Cursor and Windsurf — add http://localhost:3000/mcp as an MCP server in editor settings.
| Tool | What it provides |
|---|---|
| get_infra_overview | Complete snapshot — all services, counts, and high-severity findings |
| get_graph_summary | Full infrastructure graph — all nodes, edges, and findings |
| analyze_function | Issues in a specific function — scans, missing indexes, N+1 |
| suggest_gsi | Exact GSI config for a DynamoDB table + attribute |
| postgres_index_suggestions | Exact CREATE INDEX SQL for your actual table |
| suggest_mongo_index | Exact createIndex command for a MongoDB collection + field |
| mysql_index_suggestions | Exact ALTER TABLE ADD INDEX SQL for your MySQL table |
| get_queue_details | SQS queues — DLQ status, encryption, message counts |
| get_topic_details | SNS topics — subscription counts and protocols |
| get_secrets_overview | Secrets Manager — names and rotation status (values never included) |
| get_parameter_overview | SSM Parameter Store — names, types, tiers (values never included) |
| get_lambda_overview | Lambda functions — runtime, memory, timeout, env var key names |
| get_log_errors | CloudWatch error patterns and counts (no raw log messages) |
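Clients invoke these tools with MCP's JSON-RPC tools/call method, POSTed to the Streamable HTTP endpoint shown earlier. A hedged sketch of a request payload (result shapes are defined by the server, and the empty arguments object is an assumption for this particular tool):

```typescript
// Hypothetical JSON-RPC 2.0 payload a client could POST to
// http://localhost:3000/mcp to invoke get_infra_overview.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get_infra_overview",
    arguments: {}, // assumed: this tool takes no arguments
  },
};

console.log(JSON.stringify(request));
```

In practice the editor integrations below construct these payloads for you; the shape is shown only to make the transport concrete.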
| Command | What it does |
|---|---|
| infrawise init | Detect AWS + repo, generate infrawise.yaml |
| infrawise auth | Select or switch AWS profile |
| infrawise analyze | Scan repo + AWS, build graph, print findings |
| infrawise dev | Start MCP server — auto-analyzes if no cache, watches files for live refresh |
| infrawise doctor | Validate AWS access, DB connectivity, and config |
infrawise.yaml is generated by infrawise init and lives in your repo root. Every service must be explicitly enabled: true — infrawise never connects to anything not listed in config.
Connection strings support ${ENV_VAR} substitution so passwords never need to be committed:
postgres:
enabled: true
connectionString: postgresql://infrawise_ro:${DB_PASSWORD}@host:5432/mydb
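For intuition, such substitution can be sketched in a few lines; this is a hypothetical illustration, not Infrawise's actual implementation:

```typescript
// Replace each ${VAR} placeholder with its value from the given
// environment map, failing loudly if a variable is missing.
function substituteEnvVars(
  value: string,
  env: Record<string, string | undefined>,
): string {
  return value.replace(/\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g, (_m, name: string) => {
    const resolved = env[name];
    if (resolved === undefined) {
      throw new Error(`Missing environment variable: ${name}`);
    }
    return resolved;
  });
}

// substituteEnvVars("postgresql://infrawise_ro:${DB_PASSWORD}@host:5432/mydb",
//                   { DB_PASSWORD: "s3cret" })
//   → "postgresql://infrawise_ro:s3cret@host:5432/mydb"
```

Because the config file only ever contains the placeholder, the secret itself lives in the process environment and never reaches version control.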
Full example:
project: payments-service
aws:
profile: default # AWS profile from ~/.aws/credentials
region: ap-south-1
dynamodb:
enabled: true
includeTables: # omit to include all tables
- Orders
- Users
postgres:
enabled: true
connectionString: postgresql://infrawise_ro:${DB_PASSWORD}@host:5432/mydb
mysql:
enabled: false
connectionString: ""
mongodb:
enabled: false
connectionString: ""
sqs:
enabled: true
sns:
enabled: true
ssm:
enabled: true
paths: [] # filter by prefix e.g. ["/myapp/prod"]
secretsManager:
enabled: true
lambda:
enabled: true
rds:
enabled: false
kafka:
enabled: false
cloudwatchLogs:
enabled: false
logGroupPrefixes: []
windowHours: 24
analysis:
sampleSize: 100
Infrawise is read-only. Minimum IAM policy required:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"dynamodb:ListTables",
"dynamodb:DescribeTable"
],
"Resource": "*"
}
]
}
For SSO profiles, log in before running infrawise:
aws sso login --profile myprofile
Create a read-only user for infrawise:
CREATE USER infrawise_ro WITH PASSWORD 'yourpassword';
GRANT CONNECT ON DATABASE yourdb TO infrawise_ro;
GRANT USAGE ON SCHEMA public TO infrawise_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO infrawise_ro;
For Amazon RDS: allow inbound on port 5432 from your machine's IP in the security group.
Infrawise has two analysis layers:
Works from AWS APIs, database schema introspection, and IaC files — no dependency on application code:
| Service | What it checks |
|---|---|
| DynamoDB schema | Tables, GSIs, partition keys |
| PostgreSQL / MySQL schema | Tables, indexes, column types |
| MongoDB schema | Collections, indexes |
| SQS | Missing DLQs, unencrypted queues, large backlogs |
| Kafka (kafkajs) | Producer/consumer topic mapping from code |
| Secrets Manager | Missing secret rotation |
| Lambda | Default memory (128 MB), high timeouts |
| RDS | Publicly accessible, no backups, unencrypted, no deletion protection, single-AZ |
| CloudWatch Logs | Log groups with no retention policy |
| Terraform / CloudFormation / CDK | IaC drift vs deployed state |
Uses ts-morph AST analysis to detect which functions call which tables and how:
| Analyzer | Severity | What it detects |
|---|---|---|
| Full Table Scan (DynamoDB) | High | .scan() calls without filters |
| Missing GSI | Medium | Queries on attributes without a matching GSI |
| Hot Partition | Medium | 5+ distinct code paths hitting the same table |
| Missing Index (PostgreSQL) | Medium | Tables queried without indexes |
| N+1 Query | Medium | Repeated query patterns from ORM loops |
| Large SELECT | Low | SELECT * usage |
| Missing MySQL Index | Medium | MySQL tables queried without indexes |
| MySQL Full Table Scan | High | Full table scan patterns in MySQL queries |
| Missing Mongo Index | Medium | Collections queried without secondary indexes |
| Collection Scan | High | find() calls without filter predicates |
Non-TypeScript/JavaScript projects still get full value from infrastructure-level analyzers — code correlation (function-to-table mapping, N+1 patterns) is skipped.
The scanner supports: AWS SDK v3/v2 for DynamoDB, pg/Prisma/Knex for PostgreSQL, mysql2/Knex for MySQL, driver/Mongoose for MongoDB, AWS SDK v3 for SQS/SNS/SSM/Secrets/Lambda, and kafkajs for Kafka topics (producer/consumer).
Infrawise does not use an LLM to analyze your infrastructure. All extraction and analysis are deterministic: AST parsing, schema introspection, rule-based analyzers, and graph correlation. LLMs are only consumers of the generated context through MCP.
Your repo (any language) Your repo (TS/JS only)
│ │
│ Repository Scanner (ts-morph AST)
│ which functions → which tables
│ │
┌───────┴──────────────────────────────────┴────────────┐
│ infrawise analyze │
│ │
│ AWS APIs / DB schema / IaC files + Code ops (opt) │
│ (works for any project) (TS/JS only) │
│ │ │
│ Graph Engine │
│ (nodes + edges) │
│ │ │
│ Analyzer Engine │
│ (rule-based, deterministic) │
└─────────────────────────┬─────────────────────────────┘
│
┌──────────────────┐
│ MCP Server │ ◄── Claude Code
│ localhost:3000 │ ◄── Cursor
└──────────────────┘ ◄── Windsurf
src/
types.ts Shared type definitions
core/ Config (Zod + YAML), logger (Pino), local cache
graph/ Graph engine — nodes, edges, builder
adapters/ Flat extractors: dynamodb.ts, postgres.ts, mysql.ts,
mongodb.ts, aws.ts, logs.ts, terraform.ts
analyzers/ 23 rule-based analyzers
context/ Repository scanner (ts-morph AST)
server/ Fastify MCP server (@modelcontextprotocol/sdk, Streamable HTTP)
cli/ CLI commands (Commander.js)
Node.js 24+, pnpm 9+, AWS CLI (for integration testing).
git clone https://github.com/Sidd27/infrawise
cd infrawise
pnpm install
pnpm build
pnpm build # compile
pnpm test # run all tests
pnpm typecheck # TypeScript strict check
pnpm lint # ESLint
Create a new file in src/analyzers/ implementing the Analyzer interface:

export class MyAnalyzer implements Analyzer {
name = 'MyAnalyzer';
async analyze(graph: SystemGraph): Promise<Finding[]> { ... }
}
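To make the skeleton above concrete, here is a hedged, self-contained sketch of a complete analyzer. The SystemGraph, GraphNode, and Finding shapes below are simplified assumptions for illustration, not the real types from src/types.ts:

```typescript
// Assumed, simplified type shapes — the project's actual types differ.
interface Finding { severity: "high" | "medium" | "low"; message: string }
interface GraphNode { id: string; kind: string; retentionDays?: number }
interface SystemGraph { nodes: GraphNode[] }
interface Analyzer {
  name: string;
  analyze(graph: SystemGraph): Promise<Finding[]>;
}

// Mirrors the built-in CloudWatch check: flag log groups that have
// no retention policy configured.
class LogRetentionAnalyzer implements Analyzer {
  name = "LogRetentionAnalyzer";
  async analyze(graph: SystemGraph): Promise<Finding[]> {
    return graph.nodes
      .filter((n) => n.kind === "logGroup" && n.retentionDays === undefined)
      .map((n) => ({
        severity: "low" as const,
        message: `Log group ${n.id} has no retention policy`,
      }));
  }
}
```

An analyzer is a pure function over the graph: it receives the already-built nodes and edges and returns findings, which keeps every rule deterministic and independently testable.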
Then register it in src/analyzers/index.ts and add tests under src/analyzers/__tests__/.

To add a new adapter: create src/adapters/yourdb.ts returning Promise<YourTableMetadata[]>, extend src/types.ts if needed, and wire it into src/cli/commands/analyze.ts.

pnpm release patch # 0.1.2 → 0.1.3 (bug fixes)
pnpm release minor # 0.1.2 → 0.2.0 (new features, backwards compatible)
pnpm release major # 0.1.2 → 1.0.0 (breaking changes)
pnpm release 1.5.0 # explicit version
Bumps package.json, commits, tags, pushes, and creates a draft GitHub release with notes from commit messages. Then publish the draft on GitHub to trigger npm publish.
Before submitting a PR, make sure:

- pnpm lint passes
- pnpm typecheck passes
- pnpm test passes

License: MIT