# Installation
## Prerequisites
You need exactly two things:
- Docker + Docker Compose — Install Docker Desktop (includes Compose)
- An LLM API key — OpenRouter is recommended (one key, access to all models)
That’s it. No Node.js, no Python, no database setup. Docker handles everything.
## Clone & Configure
```
git clone https://github.com/BaseDatum/djinnbot.git
cd djinnbot
cp .env.example .env
```

Open `.env` in your editor and set your API key:
```
# Required — this is the only thing you must set
OPENROUTER_API_KEY=sk-or-v1-your-key-here
```

### Optional: Encryption Key
For production deployments, generate a secrets encryption key:
```
# Generate and add to .env
python3 -c "import secrets; print('SECRET_ENCRYPTION_KEY=' + secrets.token_hex(32))" >> .env
```

This encrypts user-defined secrets (API keys, SSH keys, etc.) at rest. Without it, secrets are encrypted with an ephemeral key that resets on restart.
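If you want to confirm the key was generated correctly: `token_hex(32)` always produces exactly 64 lowercase hex characters. A minimal sketch (the `valid_encryption_key` helper is illustrative, not part of djinnbot):

```shell
# Returns success only for a 64-character lowercase hex string,
# the exact shape produced by secrets.token_hex(32).
valid_encryption_key() {
  printf '%s' "$1" | grep -Eq '^[0-9a-f]{64}$'
}

# Example: validate a freshly generated key
key=$(python3 -c 'import secrets; print(secrets.token_hex(32))')
valid_encryption_key "$key" && echo "key format OK"
```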
### Optional: Internal Token
This token protects the plaintext secrets endpoint from unauthorized access:
```
python3 -c "import secrets; print('ENGINE_INTERNAL_TOKEN=' + secrets.token_urlsafe(32))" >> .env
```

Without this, the endpoint that returns decrypted secrets is open to anyone who can reach the API. See Security Model for details.
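Before starting the stack, it can help to confirm `.env` contains everything you expect. A rough sketch (the `check_env` helper is hypothetical; only `OPENROUTER_API_KEY` is strictly required, the other two are for production setups):

```shell
# Fails, naming the first missing key, if any expected KEY= line is absent.
check_env() {
  env_file="$1"; shift
  for key in "$@"; do
    grep -q "^${key}=" "$env_file" || { echo "missing: $key" >&2; return 1; }
  done
  echo "env looks complete"
}

# Example: require all three keys for a production setup
# check_env .env OPENROUTER_API_KEY SECRET_ENCRYPTION_KEY ENGINE_INTERNAL_TOKEN
```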
## Start Services
```
docker compose up -d
```

This starts 6 services:
| Service | Container | Port | Purpose |
|---|---|---|---|
| PostgreSQL | djinnbot-postgres | 5432 | State database |
| Redis | djinnbot-redis | 6379 | Event bus (Redis Streams) |
| API Server | djinnbot-api | 8000 | REST API (FastAPI) |
| Pipeline Engine | djinnbot-engine | — | Orchestrates agent execution |
| Dashboard | djinnbot-dashboard | 3000 | React web interface |
| MCP Proxy | djinnbot-mcpo | 8001 | Tool server proxy |
Check that everything is healthy:
```
docker compose ps
```

You should see all services running with healthy status.
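Health checks can take a little while after `docker compose up -d`. One way to wait programmatically is to poll container health via `docker inspect`; the sketch below assumes the core containers define Docker healthchecks (the `all_healthy` helper is hypothetical):

```shell
# Succeeds only if every line read from stdin is exactly "healthy".
all_healthy() {
  while IFS= read -r status; do
    [ "$status" = "healthy" ] || return 1
  done
}

# Example: check the stateful services by the container names in the table above
# docker inspect -f '{{.State.Health.Status}}' \
#   djinnbot-postgres djinnbot-redis djinnbot-api | all_healthy && echo "core services healthy"
```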
## Verify
Open the dashboard:
```
http://localhost:3000
```

Check the API:

```
curl http://localhost:8000/v1/status
```

You should see a JSON response with `"status": "ok"` and connected service counts.
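For scripting, the same check can be done without eyeballing the JSON. A sketch that only assumes the response has a top-level `"status"` field (the `status_ok` helper is hypothetical):

```shell
# Reads JSON from stdin; exits 0 only if the top-level "status" field equals "ok".
status_ok() {
  python3 -c 'import sys, json; sys.exit(0 if json.load(sys.stdin).get("status") == "ok" else 1)'
}

# Example against the running stack:
# curl -s http://localhost:8000/v1/status | status_ok && echo "API is up"
```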
## What Just Happened
Docker Compose built and started the entire stack:
- PostgreSQL stores pipeline runs, steps, agent state, project boards, and settings
- Redis provides the event bus via Redis Streams — reliable, ordered message delivery between services
- API Server (FastAPI/Python) exposes REST endpoints for the dashboard, CLI, and external integrations
- Pipeline Engine (TypeScript/Node) runs the state machine that coordinates agent execution, spawns agent containers, manages memory, and bridges Slack
- Dashboard (React/Vite) serves the web interface with real-time SSE streaming
- MCP Proxy (mcpo) exposes MCP tool servers (GitHub, web fetch, etc.) as OpenAPI endpoints for agents
When a pipeline runs, the engine dynamically spawns agent containers — isolated Docker containers with a full engineering toolbox — for each step. These are separate from the 6 core services and are created/destroyed per step.