mcp-server-fear-greed: an MCP server for the CNN Fear & Greed Index
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-ycjcl868-mcp-server-fear-greed": {
"args": [
"-y",
"mcp-server-fear-greed"
],
"command": "npx"
}
}
}
From the project's GitHub README:
A Model Context Protocol (MCP) server that provides access to the CNN Fear & Greed Index for the US stock market. The server fetches real-time market sentiment data and returns it both as structuredContent and as plain text content.
First, install the Fear & Greed MCP server with your client. A typical configuration looks like this:
{
"mcpServers": {
"mcp-server-fear-greed": {
"command": "npx",
"args": [
"-y",
"mcp-server-fear-greed@latest"
]
}
}
}
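If you manage your MCP configuration programmatically, a small script can merge this entry into an existing config file. This is an illustrative sketch, not part of the project; the config path is whatever file your client actually reads.

```typescript
import { readFileSync, writeFileSync, existsSync } from 'node:fs';

// Merge the mcp-server-fear-greed entry into an MCP config file,
// preserving any servers already configured. The file path and the
// helper name are illustrative, not part of the package.
function addFearGreedServer(configPath: string): void {
  const config = existsSync(configPath)
    ? JSON.parse(readFileSync(configPath, 'utf8'))
    : {};
  config.mcpServers = {
    ...config.mcpServers,
    'mcp-server-fear-greed': {
      command: 'npx',
      args: ['-y', 'mcp-server-fear-greed@latest'],
    },
  };
  writeFileSync(configPath, JSON.stringify(config, null, 2) + '\n');
}
```

Running it against a missing file creates the config from scratch; running it against an existing one only adds (or overwrites) this single entry.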
You can also install the mcp-server-fear-greed MCP server using the VS Code CLI:
# For VS Code
code --add-mcp '{"name":"mcp-server-fear-greed","command":"npx","args":["mcp-server-fear-greed@latest"]}'
After installation, the Fear & Greed MCP server will be available for use with your GitHub Copilot agent in VS Code.
Go to Cursor Settings -> MCP -> Add new MCP Server. Use any name you like, with the command npx mcp-server-fear-greed. You can also verify the config or add command-line arguments by clicking Edit.
{
"mcpServers": {
"mcp-server-fear-greed": {
"command": "npx",
"args": [
"mcp-server-fear-greed@latest"
]
}
}
}
Follow the Windsurf MCP documentation and use the following configuration:
{
"mcpServers": {
"mcp-server-fear-greed": {
"command": "npx",
"args": [
"mcp-server-fear-greed@latest"
]
}
}
}
Follow the MCP install guide and use the following configuration:
{
"mcpServers": {
"mcp-server-fear-greed": {
"command": "npx",
"args": [
"mcp-server-fear-greed@latest"
]
}
}
}
You can also pass the --port $your_port argument so the server runs remotely, exposing both SSE and Streamable HTTP endpoints.
# run as a remote MCP server
npx mcp-server-fear-greed --port 8089
You can then use either of the two remote endpoints:
http://127.0.0.1:8089/mcp
http://127.0.0.1:8089/sse
Then, in your MCP client config, set the url to the SSE endpoint:
{
"mcpServers": {
"mcp-server-fear-greed": {
"url": "http://127.0.0.1::8089/sse"
}
}
}
Or set the url to the Streamable HTTP endpoint:
{
"mcpServers": {
"mcp-server-fear-greed": {
"type": "streamable-http", // If there is MCP Client support
"url": "http://127.0.0.1::8089/mcp"
}
}
}
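The two remote configs above differ only in the endpoint path and the optional type field. As an illustration (not part of the project), a small helper can generate either client entry from the port and transport kind:

```typescript
// Build a client config entry for either remote transport.
// The shapes mirror the JSON snippets above; the helper itself
// is illustrative and not provided by the package.
type RemoteKind = 'sse' | 'streamable-http';

function remoteEntry(port: number, kind: RemoteKind) {
  const path = kind === 'sse' ? 'sse' : 'mcp';
  const entry: { type?: string; url: string } = {
    url: `http://127.0.0.1:${port}/${path}`,
  };
  // Only the Streamable HTTP entry carries an explicit type field.
  if (kind === 'streamable-http') entry.type = 'streamable-http';
  return { mcpServers: { 'mcp-server-fear-greed': entry } };
}
```

For example, `remoteEntry(8089, 'sse')` produces the SSE config shown above.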
If your MCP Client is developed based on JavaScript / TypeScript, you can directly use in-process calls to avoid requiring your users to install the command-line interface to use Fear & Greed MCP.
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { InMemoryTransport } from '@modelcontextprotocol/sdk/inMemory.js';
// type: module project usage
import { createServer } from 'mcp-server-fear-greed';
// commonjs project usage
// const { createServer } = await import('mcp-server-fear-greed')
const client = new Client(
{
name: 'test fear greed client',
version: '1.0',
},
{
capabilities: {},
},
);
const server = createServer();
const [clientTransport, serverTransport] = InMemoryTransport.createLinkedPair();
await Promise.all([
client.connect(clientTransport),
server.connect(serverTransport),
]);
// list tools
const result = await client.listTools();
console.log(result);
// call tool
const toolResult = await client.callTool({
name: 'get_fear_greed_index',
arguments: {
format: 'json'
},
});
console.log(toolResult);
get_fear_greed_index
Fetches the current Fear & Greed Index and related market indicators.
format (optional): Output format
"structured" (default): Returns formatted markdown with organized data
"json": Returns raw JSON data
// Get structured output
await client.callTool({ name: 'get_fear_greed_index' });
// Get JSON output
await client.callTool({ name: 'get_fear_greed_index', arguments: { format: 'json' } });
The tool returns data in the following structure:
{
"fear_and_greed": {
"score": 75,
"rating": "greed",
"timestamp": "2025-07-18T23:59:57+00:00",
"previous_close": 75.31,
"previous_1_week": 75.26,
"previous_1_month": 54.29,
"previous_1_year": 45.94
},
"fear_and_greed_historical": {
"timestamp": 1752883197000,
"score": 75,
"rating": "greed"
},
"market_momentum_sp500": {
"timestamp": 1752871567000,
"score": 61.2,
"rating": "greed"
},
"market_momentum_sp125": {
"timestamp": 1752871567000,
"score": 61.2,
"rating": "greed"
},
"stock_price_strength": {
"timestamp": 1752883197000,
"score": 80,
"rating": "extreme greed"
},
"stock_price_breadth": {
"timestamp": 1752883197000,
"score": 84,
"rating": "extreme greed"
},
"put_call_options": {
"timestamp": 1752871897000,
"score": 79.6,
"rating": "extreme greed"
},
"market_volatility_vix": {
"timestamp": 1752869701000,
"score": 50,
"rating": "neutral"
},
"market_volatility_vix_50": {
"timestamp": 1752869701000,
"score": 50,
"rating": "neutral"
},
"junk_bond_demand": {
"timestamp": 1752877800000,
"score": 88.8,
"rating": "extreme greed"
},
"safe_haven_demand": {
"timestamp": 1752868799000,
"score": 81.4,
"rating": "extreme greed"
}
}
The rating field follows CNN's scale, ranging from extreme fear to extreme greed.
For local development, start the dev server:
npm run dev
Then access the inspector at http://127.0.0.1:6274/.
The server includes comprehensive error handling.