Server data from the Official MCP Registry
Hardware probe for performance diagnostics, thermal monitoring, and LLM optimization.
Valid MCP server (3 strong, 4 medium validity signals). 7 known CVEs in dependencies (1 critical, 6 high severity). Package registry verified. Imported from the Official MCP Registry.
9 files analyzed · 8 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Unverified package source
We couldn't verify that the installable package matches the reviewed source code. Proceed with caution.
Set these up before or after installing:
Environment variable: REMOTE_API_URL
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-yamaru-eu-hardware-probe": {
"env": {
"REMOTE_API_URL": "your-remote-api-url-here"
},
"args": [
"-y",
"@yamaru-eu/hardware-probe"
],
"command": "npx"
}
}
}

From the project's GitHub README:
Expert system hardware probe and performance diagnostic engine for AI, Gaming, and High-Performance workflows. This is a Model Context Protocol (MCP) server that provides deep system insights beyond simple specifications.
gemini extension install @yamaru-eu/hardware-probe
Add this to your MCP settings file (e.g., npx-config.json or claude_desktop_config.json):
{
"mcpServers": {
"yamaru-probe": {
"command": "npx",
"args": ["-y", "@yamaru-eu/hardware-probe"]
}
}
}
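If you manage MCP settings files programmatically rather than by hand, the entry above can be merged into an existing config without clobbering servers that are already registered. A minimal sketch, assuming Node.js; the `addServer` helper and the config file path are illustrative, not part of the package:

```typescript
import * as fs from "node:fs";

// Shape of an MCP settings file: a map of server names to launch entries.
type McpConfig = { mcpServers?: Record<string, unknown> };

// Illustrative helper: return a new config with one server entry added,
// preserving any servers that were already configured.
function addServer(config: McpConfig, name: string, entry: unknown): McpConfig {
  return {
    ...config,
    mcpServers: { ...(config.mcpServers ?? {}), [name]: entry },
  };
}

// Load the existing settings file if present, otherwise start empty.
const path = "claude_desktop_config.json"; // adjust to your client's config path
const existing: McpConfig = fs.existsSync(path)
  ? JSON.parse(fs.readFileSync(path, "utf8"))
  : {};

const updated = addServer(existing, "yamaru-probe", {
  command: "npx",
  args: ["-y", "@yamaru-eu/hardware-probe"],
});

console.log(JSON.stringify(updated, null, 2));
```

Writing the result back with `fs.writeFileSync` is left out deliberately; review the merged JSON before overwriting a live config.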
analyze_local_system: Full hardware inventory.
analyze_performance: Real-time performance metrics and top processes.
analyze_ram_pressure: Detailed memory pressure and RSS analysis for deep RAM troubleshooting.
check_storage_health: Disk SMART health, firmware, and I/O bottleneck analysis.
thermal_profile: Real-time CPU/GPU thermal states, fan speeds, and frequency throttling detection.
diagnose_antivirus_impact: Detects EDR/antivirus conflicts and exclusion coverage on dev paths.
monitor_system_health: Statistical health report (min/max/avg) over a specified duration.
check_llm_compatibility (BETA): Predicts performance for a specific LLM model via remote API.
get_llm_recommendations (BETA): Recommends the best local models via remote API.
analyze_inference_config: Deep dive into AI runtimes and environment variables.
When used with Gemini CLI, this extension provides the following expert skills:
hardware-performance-expert: Global protocol for system health and troubleshooting.
local-inference-optimizer: Specialized logic for fine-tuning local LLM runs.
npm install # Install dependencies
npm run build # Compile TypeScript → dist/
npm run test # Run test suite
npm run inspector # Test tools in the MCP Inspector
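To give a sense of what a tool like analyze_local_system works with, here is a minimal sketch of a hardware-inventory snapshot built on Node's standard `os` module. The field names and the `collectInventory` helper are illustrative only and do not reflect the server's actual schema:

```typescript
import * as os from "node:os";

// Illustrative shape of a basic hardware-inventory result.
interface HardwareInventory {
  platform: string;     // e.g. "linux", "darwin", "win32"
  arch: string;         // e.g. "x64", "arm64"
  cpuModel: string;
  logicalCores: number;
  totalMemMb: number;
  freeMemMb: number;
}

// Gather a snapshot from Node's os module. A real probe would go much
// deeper (SMART data, thermal sensors, GPU state) via native interfaces.
function collectInventory(): HardwareInventory {
  const cpus = os.cpus();
  return {
    platform: os.platform(),
    arch: os.arch(),
    cpuModel: cpus[0]?.model ?? "unknown",
    logicalCores: cpus.length,
    totalMemMb: Math.round(os.totalmem() / (1024 * 1024)),
    freeMemMb: Math.round(os.freemem() / (1024 * 1024)),
  };
}

console.log(JSON.stringify(collectInventory(), null, 2));
```

An MCP server would return a structure like this as a tool result rather than printing it; the point here is only the kind of data involved.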
Apache 2.0 - Part of the Yamaru Project.
by Modelcontextprotocol · Developer Tools
Web content fetching and conversion for efficient LLM usage
by Modelcontextprotocol · Developer Tools
Read, search, and manipulate Git repositories programmatically
by Toleno · Developer Tools
Toleno Network MCP Server: Manage your Toleno mining account with Claude AI using natural language.
by mcp-marketplace · Developer Tools
Create, build, and publish Python MCP servers to PyPI, conversationally.