Turn your idea into a comprehensive plan in minutes, not months.
PlanExe is the premier planning tool for AI agents.
- A business plan for a Minecraft-themed escape room.
- A business plan for a Faraday cage manufacturing company.
- A pilot project for a Human-as-a-Service.
- See more examples here.
PlanExe is an open-source tool that turns a single plain-English goal statement into a 40-page strategic plan in roughly 15 minutes, using local or cloud models. It is an accelerator for outlines, not a silver bullet for polished plans.
Typical output contains:
- Executive summary
- Gantt chart
- Governance structure
- Role descriptions
- Stakeholder maps
- Risk registers
- SWOT analyses
The technical quality of structure, formatting, and coherence is consistently excellent—often superior to human junior/mid-tier consulting drafts. However, budgets remain headline-only, timelines contain errors, metrics are usually vague, and legal/operational realism is weak on high-stakes topics. A usable, client-ready version still requires weeks to months of skilled human refinement.
PlanExe removes 70–90% of the labor of building a planning scaffold on any topic, but the final 10–30% that separates a polished document from a credible, defensible plan remains human-only work.
New to PlanExe? Follow the Getting Started guide.
PlanExe exposes an MCP server for AI agents at https://mcp.planexe.org/
This assumes you have an MCP-compatible client (OpenClaw, Cursor, Codex, LM Studio, Windsurf, Inspector).
The tool workflow (tools-only, not the MCP tasks protocol):
- `prompt_examples`
- `task_create`
- `task_status` (poll every 5 minutes until done)
- download the result via `task_download` or via `task_file_info`
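The polling step above can be sketched as a small, client-agnostic helper. This is a minimal sketch: `check_status` is a hypothetical callable standing in for a `task_status` tool call, and the state strings are assumptions, not PlanExe's documented API.

```python
import time

def poll_until_done(check_status, interval_seconds=300, max_polls=12):
    """Call check_status() on an interval until the task reports a
    terminal state. check_status stands in for an MCP task_status
    tool call and is assumed to return a state string."""
    for _ in range(max_polls):
        state = check_status()
        if state in ("done", "failed"):
            return state
        time.sleep(interval_seconds)
    raise TimeoutError("task did not finish within the polling budget")

# Simulated task that finishes on the third poll.
states = iter(["pending", "running", "done"])
result = poll_until_done(lambda: next(states), interval_seconds=0)
```

The 5-minute interval from the workflow above maps to `interval_seconds=300`; the simulation shortens it to zero only so the example runs instantly.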
- An account at https://home.planexe.org.
- Sufficient funds to create plans.
- A PlanExe API key (`pex_...`) from your account
Use this endpoint directly in your MCP client:
{
"mcpServers": {
"planexe": {
"url": "https://mcp.planexe.org/mcp",
"headers": {
"X-API-Key": "pex_your_api_key_here"
}
}
}
}

If you want artifacts saved directly to your disk from your MCP client, run the local proxy:
{
"mcpServers": {
"planexe": {
"command": "uv",
"args": [
"run",
"--with",
"mcp",
"/absolute/path/to/PlanExe/mcp_local/planexe_mcp_local.py"
],
"env": {
"PLANEXE_URL": "https://mcp.planexe.org/mcp",
"PLANEXE_MCP_API_KEY": "pex_your_api_key_here",
"PLANEXE_PATH": "/absolute/path/for/downloads"
}
}
}
}

- Docker
- OpenRouter account
- Create a PlanExe `.env` file with `OPENROUTER_API_KEY`.
Start the full stack:
docker compose up --build

Make sure that you can create plans in the web interface before proceeding to MCP.
Then connect your client to:
http://localhost:8001/mcp
For local docker defaults, auth is disabled in docker-compose.yml.
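The two direct-connection configs (cloud with an API key, local Docker without auth) differ only in the URL and headers. A minimal sketch that generates either variant; `make_mcp_config` is a hypothetical helper written for this document, assuming nothing beyond the JSON shapes and the `pex_` key prefix shown above:

```python
import json

def make_mcp_config(url, api_key=None):
    """Build a PlanExe mcpServers config as a JSON string.
    api_key=None matches the local Docker default, where auth is
    disabled; pass a pex_ key for the cloud endpoint."""
    server = {"url": url}
    if api_key is not None:
        if not api_key.startswith("pex_"):
            raise ValueError("PlanExe API keys start with 'pex_'")
        server["headers"] = {"X-API-Key": api_key}
    return json.dumps({"mcpServers": {"planexe": server}}, indent=2)

print(make_mcp_config("http://localhost:8001/mcp"))
```

Passing `make_mcp_config("https://mcp.planexe.org/mcp", "pex_your_api_key_here")` reproduces the cloud config shown earlier.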
If you want artifacts saved directly to your disk from your MCP client, run the local proxy:
{
"mcpServers": {
"planexe": {
"command": "uv",
"args": [
"run",
"--with",
"mcp",
"/absolute/path/to/PlanExe/mcp_local/planexe_mcp_local.py"
],
"env": {
"PLANEXE_URL": "http://localhost:8001/mcp/",
"PLANEXE_PATH": "/absolute/path/for/downloads"
}
}
}
}

- Setup overview: https://docs.planexe.org/mcp/mcp_setup/
- Tool details and flow: https://docs.planexe.org/mcp/mcp_details/
- MCP Inspector guide: https://docs.planexe.org/mcp/inspector/
- Cursor setup: https://docs.planexe.org/mcp/cursor/
- Codex setup: https://docs.planexe.org/mcp/codex/
- PlanExe MCP interface: https://docs.planexe.org/mcp/planexe_mcp_interface/
- MCP Registry publishing metadata (`server.json`): mcp_cloud/server.json
- llms.txt: https://mcp.planexe.org/llms.txt
Run locally with Docker (Click to expand)
Prerequisite: Docker with Docker Compose installed; you only need basic Docker knowledge. No local Python setup is required because everything runs in containers.
- Clone the repo and enter it:
git clone https://github.com/PlanExeOrg/PlanExe.git
cd PlanExe
- Provide an LLM provider. Copy `.env.docker-example` to `.env` and fill in `OPENROUTER_API_KEY` with your key from OpenRouter. The containers mount `.env` and `llm_config/`; pick a model profile there. For host-side Ollama, use the `docker-ollama-llama3.1` entry and ensure Ollama is listening on http://host.docker.internal:11434.
- Start the stack (first run builds the images):

docker compose up worker_plan frontend_single_user

The worker listens on http://localhost:8000 and the UI comes up on http://localhost:7860 after the worker healthcheck passes.
- Open http://localhost:7860 in your browser. Optional: set `PLANEXE_PASSWORD` in `.env` to require a password. Enter your idea, click the generate button, and watch progress with:

docker compose logs -f worker_plan

Outputs are written to run/ on the host (mounted into both containers).
- Stop with `Ctrl+C` (or `docker compose down`). Rebuild after code/dependency changes:

docker compose build --no-cache worker_plan frontend_single_user

For compose tips, alternate ports, or troubleshooting, see docs/docker.md or docker-compose.md.
Config A: Run a model in the cloud using a paid provider. Follow the instructions in OpenRouter.
Config B: Run models locally on a high-end computer. Follow the instructions for either Ollama or LM Studio. When using host-side tools with Docker, point the model URL at the host (for example http://host.docker.internal:11434 for Ollama).
Recommendation: Config A offers the most straightforward path to getting PlanExe working reliably.
Screenshots (Click to expand)
You input a vague description of what you want and PlanExe outputs a plan.

