Minimal stdio MCP server for parallel task execution by AI agents.
AgentTasker is a small, stdio-only MCP server for AI agents that need to run multiple tasks quickly and get structured results back in one call.
It is intentionally narrow: two public tools, `execute` and `execute_batch`, plus explicit task dependencies via `depends_on`.

Repository: https://github.com/S3bRR/agent-tasker-mcp
Most agent orchestration layers are heavier than they need to be. This project is designed for the common case: there is no queue service, no persistence layer, no background worker system, and no SDK dependency required at runtime.
Task types:

- `python_code`
- `http_request`
- `discovery_search`
- `web_scrape`
- `shell_command`
- `file_read`
- `file_write`

Public MCP tools:

- `execute`
- `execute_batch`

### uvx

Once the package is live on PyPI:

```sh
uvx agent-tasker-mcp-server --workers 8
```

Until then, run directly from GitHub:

```sh
uvx --from git+https://github.com/S3bRR/agent-tasker-mcp.git agent-tasker-mcp-server --workers 8
```

### pipx

Once the package is live on PyPI:

```sh
pipx install agent-tasker-mcp-server
```

Until then:

```sh
pipx install git+https://github.com/S3bRR/agent-tasker-mcp.git
```
### From source

```sh
git clone https://github.com/S3bRR/agent-tasker-mcp.git
cd agent-tasker-mcp
./setup.sh
```
Client configuration, once the package is on PyPI:

```json
{
  "command": "uvx",
  "args": ["agent-tasker-mcp-server", "--workers", "8"]
}
```

Until then, from GitHub:

```json
{
  "command": "uvx",
  "args": [
    "--from",
    "git+https://github.com/S3bRR/agent-tasker-mcp.git",
    "agent-tasker-mcp-server",
    "--workers",
    "8"
  ]
}
```

Or from a local checkout:

```json
{
  "command": "/absolute/path/to/agent-tasker-mcp/.venv/bin/agent-tasker-mcp-server",
  "args": ["--workers", "8"]
}
```
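In most MCP clients, these command/args snippets nest under a server name inside a top-level `mcpServers` object; the server name below is illustrative:

```json
{
  "mcpServers": {
    "agent-tasker": {
      "command": "uvx",
      "args": ["agent-tasker-mcp-server", "--workers", "8"]
    }
  }
}
```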
### execute

Run one task immediately:

```json
{
  "task_type": "python_code",
  "code": "result = 6 * 7"
}
```
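The example implies a convention where a `python_code` snippet assigns its output to `result`. A minimal sketch of that convention (the function name and the lack of sandboxing are illustrative, not the server's actual implementation):

```python
def run_python_code(code: str) -> object:
    # Execute the snippet in a fresh namespace and return whatever it
    # assigned to `result` (the convention shown in the example above).
    namespace: dict = {}
    exec(code, {}, namespace)  # note: not a real sandbox
    return namespace.get("result")

print(run_python_code("result = 6 * 7"))  # prints 42
```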
### execute_batch

Run multiple tasks concurrently:

```json
{
  "tasks": [
    {
      "name": "fetch_users",
      "task_type": "http_request",
      "url": "https://api.example.com/users"
    },
    {
      "name": "calc",
      "task_type": "python_code",
      "code": "result = 6 * 7"
    }
  ],
  "output_mode": "compact"
}
```
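Concurrent fan-out with order-preserving results can be sketched with `asyncio.gather`, which returns results in input order regardless of completion order; the task runner below is a hypothetical stand-in, not the server's code:

```python
import asyncio

async def run_task(task: dict) -> dict:
    # Hypothetical stand-in for real task execution (HTTP, code, ...).
    await asyncio.sleep(0)
    return {"name": task["name"], "status": "completed"}

async def execute_batch(tasks: list) -> list:
    # gather runs all coroutines concurrently and yields results in the
    # same order as the input list, matching the batch semantics above.
    return list(await asyncio.gather(*(run_task(t) for t in tasks)))

results = asyncio.run(execute_batch([{"name": "fetch_users"}, {"name": "calc"}]))
```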
### depends_on

If one task must wait for another, make it explicit:

```json
{
  "tasks": [
    {
      "name": "write_file",
      "task_type": "file_write",
      "path": "/tmp/example.txt",
      "content": "hello"
    },
    {
      "name": "read_file",
      "task_type": "file_read",
      "path": "/tmp/example.txt",
      "depends_on": ["write_file"]
    }
  ]
}
```
If an upstream dependency fails, downstream tasks are marked failed and do not run.
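A sketch of how `depends_on` resolution with failure propagation might look; the `run` callable and the scheduler itself are illustrative, not the server's actual code:

```python
def schedule(tasks: list) -> dict:
    # Illustrative depends_on resolver: run a task only once every
    # dependency completed; if any dependency failed, mark the task
    # failed without running it.
    status: dict = {}
    pending = {t["name"]: t for t in tasks}
    while pending:
        progressed = False
        for name, task in list(pending.items()):
            deps = task.get("depends_on", [])
            if any(status.get(d) == "failed" for d in deps):
                status[name] = "failed"      # upstream failure propagates
                del pending[name]
                progressed = True
            elif all(d in status for d in deps):
                try:
                    task["run"]()            # hypothetical task body
                    status[name] = "completed"
                except Exception:
                    status[name] = "failed"
                del pending[name]
                progressed = True
        if not progressed:
            raise ValueError("unresolvable depends_on (cycle or unknown task)")
    return status
```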
`output_mode` supports:

- `compact` (default)
- `full`

The response is ordered to match the input task list, which makes it easier for models to consume without extra reconciliation logic.
Releases are tag-driven:

1. Bump `pyproject.toml` and `server.json` to the same version.
2. Merge to `main`.
3. Push a tag such as `v1.0.0`.
4. The workflow publishes `server.json` to the MCP Registry.

The release workflow rejects version drift: the pushed tag, `pyproject.toml`, and `server.json` must match exactly.
Optional environment variables:

- `AGENT_TASKER_MAX_TASKS`: maximum tasks per `execute_batch`
- `AGENT_TASKER_MAX_PAYLOAD_BYTES`: maximum payload size per task
- `AGENT_TASKER_MAX_MEMORY_MB`: soft process memory guard
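These limits can be read the usual way via `os.environ`; the fallback defaults below are illustrative, not the server's actual defaults:

```python
import os

# Fallback defaults here are assumptions for illustration only.
MAX_TASKS = int(os.environ.get("AGENT_TASKER_MAX_TASKS", "32"))
MAX_PAYLOAD_BYTES = int(os.environ.get("AGENT_TASKER_MAX_PAYLOAD_BYTES", "1048576"))
MAX_MEMORY_MB = int(os.environ.get("AGENT_TASKER_MAX_MEMORY_MB", "512"))
```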