Hardware probe for performance diagnostics, thermal monitoring, and LLM optimization.
Expert system hardware probe and performance diagnostic engine for AI, gaming, and high-performance workflows. This is a Model Context Protocol (MCP) server that provides deep system insights beyond simple specifications.
gemini extension install @yamaru-eu/hardware-probe
Add this to your MCP client's settings file (e.g., claude_desktop_config.json):
{
  "mcpServers": {
    "yamaru-probe": {
      "command": "npx",
      "args": ["-y", "@yamaru-eu/hardware-probe"]
    }
  }
}
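Under the hood, an MCP client invokes this server's tools with JSON-RPC 2.0 "tools/call" requests over stdio. The sketch below shows the shape of such a request using one of the tool names from this README; the request id and the makeToolCall helper are illustrative, not part of the server's API.

```typescript
// Shape of a JSON-RPC 2.0 request an MCP client sends to invoke a tool.
// The method name "tools/call" and params layout follow the MCP spec;
// the helper function itself is a hypothetical convenience.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function makeToolCall(
  id: number,
  name: string,
  args: Record<string, unknown> = {}
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Request real-time performance metrics from the probe
const req = makeToolCall(1, "analyze_performance");
console.log(JSON.stringify(req));
```

In practice an MCP client library (or the MCP Inspector) builds these messages for you; the sketch only shows what crosses the stdio transport.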
analyze_local_system: Full hardware inventory.
analyze_performance: Real-time performance metrics and top processes.
analyze_ram_pressure: Detailed memory pressure and RSS analysis for deep RAM troubleshooting.
check_storage_health: Disk SMART health, firmware, and I/O bottleneck analysis.
thermal_profile: Real-time CPU/GPU thermal states, fan speeds, and frequency throttling detection.
diagnose_antivirus_impact: Detects EDR/antivirus conflicts and exclusion coverage on dev paths.
monitor_system_health: Statistical health report (min/max/avg) over a specified duration.
check_llm_compatibility (BETA): Predicts performance for a specific LLM model via remote API.
get_llm_recommendations (BETA): Recommends the best local models via remote API.
analyze_inference_config: Deep-dive into AI runtimes and environment variables.

When used with the Gemini CLI, this extension provides the following expert skills:

hardware-performance-expert: Global protocol for system health and troubleshooting.
local-inference-optimizer: Specialized logic for fine-tuning local LLM runs.

npm install # Install dependencies
npm run build # Compile TypeScript → dist/
npm run test # Run test suite
npm run inspector # Test tools in the MCP Inspector
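For a sense of what monitor_system_health's statistical report (min/max/avg over a duration) involves, here is a hypothetical sketch of the aggregation step over sampled readings. The HealthSummary interface and summarize function are illustrative assumptions, not the server's actual code.

```typescript
// Illustrative min/max/avg aggregation, the kind of summary
// monitor_system_health reports over a sampling window.
interface HealthSummary {
  min: number;
  max: number;
  avg: number;
  samples: number;
}

function summarize(samples: number[]): HealthSummary {
  if (samples.length === 0) throw new Error("no samples collected");
  const min = Math.min(...samples);
  const max = Math.max(...samples);
  const avg = samples.reduce((sum, v) => sum + v, 0) / samples.length;
  return { min, max, avg, samples: samples.length };
}

// e.g. CPU temperatures in °C sampled once per second over 5 seconds
console.log(summarize([61, 64, 72, 68, 65]));
```

The real tool additionally collects the readings itself over the specified duration; the sketch covers only the statistics.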
Apache 2.0 - Part of the Yamaru Project.