A model-agnostic MCP Client-Server for .Net and Unity
CereBro is a model-agnostic AI Agent Wrapper for .Net. Now with 🔥 Model Context Protocol 🔥, based on the Official C# SDK, you can write Tools that can be used with different AI models without changing the code.
Below is a list of supported and planned models for CereBro.
Supported:
Planned:
You can install the packages from NuGet using the following commands:
dotnet add package Rob1997.CereBro
dotnet add package Rob1997.CereBro.Open-AI
servers.json file
This file contains the configuration for the MCP servers you want to use. Below is an example of the servers.json file.
[
{
"Id": "everything-server",
"Name": "Everything",
"TransportType": "stdio",
"TransportOptions": {
"command": "npx",
"arguments": "-y @modelcontextprotocol/server-everything"
}
}
]
You can check out more servers here.
export OPEN_AI_API_KEY="your-api-key"
$env:OPEN_AI_API_KEY="your-api-key"
If you want this to be permanent, you can add it to your .bashrc or .bash_profile file on Linux, or use the following command in PowerShell:
[Environment]::SetEnvironmentVariable("OPEN_AI_API_KEY", "your-api-key", "User")
Program.cs (Entry Point)
public static async Task Main(string[] args)
{
var builder = Host.CreateApplicationBuilder(args);
builder.Services.UseOpenAI(Environment.GetEnvironmentVariable("OPEN_AI_API_KEY"), "gpt-4o-mini");
IHost cereBro = builder.BuildCereBro(new CereBroConfig{ ServersFilePath = "./servers.json" });
await cereBro.RunAsync();
}
CereBro uses the Console as a chat dispatcher. You can create your own dispatcher by implementing the IChatDispatcher interface and use builder.BuildCereBro<IChatDispatcher>(config) to build CereBro's host.
dotnet run
Currently, CereBro only supports OpenAI's models. To add a new model you'll need to implement Microsoft.Extensions.AI.IChatClient, unless an implementation already exists; Microsoft already provides implementations for some models, such as OpenAI and Ollama.
Once you've done that, you can create a placeholder type that implements Microsoft.Extensions.AI.FunctionInvokingChatClient, something like this.
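The original code sample for this step isn't shown here, so below is a minimal sketch of what such a placeholder type might look like. The class name MyOllamaChatClient is purely illustrative, and this assumes FunctionInvokingChatClient exposes a constructor that takes the inner IChatClient to wrap (as in Microsoft.Extensions.AI):

```csharp
using Microsoft.Extensions.AI;

// Hypothetical placeholder type: it adds no behavior of its own.
// Its only purpose is to give CereBro a concrete type to resolve,
// while FunctionInvokingChatClient handles tool/function invocation
// around the wrapped IChatClient.
public class MyOllamaChatClient : FunctionInvokingChatClient
{
    public MyOllamaChatClient(IChatClient innerClient)
        : base(innerClient)
    {
    }
}
```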
Finally, you can use the UseChatClient<T>(this IServiceCollection services, IChatClient chatClient) where T : FunctionInvokingChatClient extension method to add your model to the service collection.
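Based on the signature above, registration might look like the sketch below. MyOllamaChatClient stands for your own FunctionInvokingChatClient subclass (the name is illustrative), and the OllamaChatClient endpoint and model name are assumptions for a local Ollama setup:

```csharp
var builder = Host.CreateApplicationBuilder(args);

// Hypothetical wiring: register your placeholder type, passing the
// underlying IChatClient implementation it should wrap.
builder.Services.UseChatClient<MyOllamaChatClient>(
    new OllamaChatClient(new Uri("http://localhost:11434"), "llama3"));

IHost cereBro = builder.BuildCereBro(new CereBroConfig { ServersFilePath = "./servers.json" });
await cereBro.RunAsync();
```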
⚠️ Note ⚠️
At the moment, CereBro doesn't support multiple models at the same time.