AWS Orchestrator Configuration
Complete configuration reference for the AWS Orchestrator Agent. The recommended way to get started is the Docker Compose setup, which links the Deep Agent backend to the TalkOps UI frontend and manages the sandbox environment automatically.
Environment Variables: The Recommended Setup
At the root of your project, define the environment variables that power the Three-Tier LLM Architecture and the MCP backends. Create a .env file based on .env.example:
General / Core MCP
```bash
GOOGLE_API_KEY="your-google-api-key"
GITHUB_PERSONAL_ACCESS_TOKEN="your-github-personal-access-token"
GITHUB_MCP_URL=https://api.githubcopilot.com/mcp
# Physical path for modules; virtual agent path is /workspace/terraform_modules/
TERRAFORM_WORKSPACE=./workspace/terraform_modules
ENVIRONMENT=development
```
The Three-Tier LLM Architecture
To balance speed, cost, and reasoning depth, the Orchestrator splits its LLM mappings across three distinct tiers. The default recommended provider is google_genai.
Tier 1: Standard LLM — used for fast, cost-effective tasks such as validation loops and minor parsing.

```bash
LLM_PROVIDER="google_genai"
LLM_MODEL="gemini-3.1-flash-lite-preview"
LLM_TEMPERATURE=0.0
LLM_MAX_TOKENS=15000
```
Tier 2: Higher-tier LLM — used by the Planner graph and by core Supervisor routing to ensure requests are dispatched accurately.

```bash
LLM_HIGHER_PROVIDER="google_genai"
LLM_HIGHER_MODEL="gemini-3.1-pro-preview"
LLM_HIGHER_TEMPERATURE=0.0
LLM_HIGHER_MAX_TOKENS=15000
```
Tier 3: DeepAgent LLM — used by the Coordinator and by the complex multi-step code generation sequence. Requires high context retention.

```bash
LLM_DEEPAGENT_PROVIDER="google_genai"
LLM_DEEPAGENT_MODEL="gemini-3.1-pro-preview"
LLM_DEEPAGENT_TEMPERATURE=1.0
LLM_DEEPAGENT_MAX_TOKENS=25000
```
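The tier mapping above can be sketched as a small environment resolver. This helper is illustrative only (it is not part of the shipped codebase); it shows how the `LLM_*`, `LLM_HIGHER_*`, and `LLM_DEEPAGENT_*` variables map to the three tiers, with the documented google_genai defaults as fallbacks:

```python
import os

# Illustrative sketch, not the project's actual loader: resolve the
# (provider, model) pair for each tier from the environment, falling
# back to the documented google_genai defaults.
TIER_DEFAULTS = {
    "": ("google_genai", "gemini-3.1-flash-lite-preview"),    # Tier 1: Standard
    "HIGHER": ("google_genai", "gemini-3.1-pro-preview"),     # Tier 2: Planner/Supervisor
    "DEEPAGENT": ("google_genai", "gemini-3.1-pro-preview"),  # Tier 3: Coordinator
}

def resolve_tier(tier=""):
    """Return (provider, model) for tier "", "HIGHER", or "DEEPAGENT"."""
    infix = f"_{tier}" if tier else ""
    default_provider, default_model = TIER_DEFAULTS[tier]
    provider = os.environ.get(f"LLM{infix}_PROVIDER", default_provider)
    model = os.environ.get(f"LLM{infix}_MODEL", default_model)
    return provider, model
```

For example, `resolve_tier("DEEPAGENT")` reads `LLM_DEEPAGENT_PROVIDER` and `LLM_DEEPAGENT_MODEL` if set, otherwise it falls back to the defaults above.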
Note on Providers: The codebase relies on standard provider abstractions. To use OpenAI or Anthropic instead, change the provider keys (e.g., LLM_PROVIDER="openai") and replace the GOOGLE_API_KEY variable with OPENAI_API_KEY.
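As a sketch of that swap, a .env fragment for an OpenAI-backed Tier 1 might look like the following. The model name is a placeholder, not a project default; substitute whichever model your account offers:

```bash
# Hypothetical example: Tier 1 served by OpenAI instead of Google
OPENAI_API_KEY="your-openai-api-key"
LLM_PROVIDER="openai"
LLM_MODEL="your-chosen-openai-model"  # placeholder, not a real default
LLM_TEMPERATURE=0.0
LLM_MAX_TOKENS=15000
```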
Quick Start with Docker Compose (Recommended)
No cloning required. You just need two files: docker-compose.yml and .env.
1. Create docker-compose.yml
```yaml
services:
  aws-orchestrator:
    image: sandeep2014/aws-orchestrator-agent:latest
    container_name: aws-orchestrator
    ports:
      - "10102:10102"
    environment:
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}
      - GITHUB_PERSONAL_ACCESS_TOKEN=${GITHUB_PERSONAL_ACCESS_TOKEN}
      - TERRAFORM_WORKSPACE=${TERRAFORM_WORKSPACE}
      - ENVIRONMENT=${ENVIRONMENT:-development}
      - LLM_PROVIDER=${LLM_PROVIDER:-google_genai}
      - LLM_MODEL=${LLM_MODEL:-gemini-3.1-flash-lite-preview}
      - LLM_HIGHER_PROVIDER=${LLM_HIGHER_PROVIDER:-google_genai}
      - LLM_HIGHER_MODEL=${LLM_HIGHER_MODEL:-gemini-3.1-pro-preview}
      - LLM_DEEPAGENT_PROVIDER=${LLM_DEEPAGENT_PROVIDER:-google_genai}
      - LLM_DEEPAGENT_MODEL=${LLM_DEEPAGENT_MODEL:-gemini-3.1-pro-preview}
    volumes:
      # Explicit mount required for the sync_workspace_to_disk validation tool
      - ./workspace:/app/workspace
    restart: unless-stopped
    networks:
      - aws-orchestrator-net

  talkops-ui:
    image: talkopsai/talkops:latest
    container_name: talkops-ui
    environment:
      - TALKOPS_AWS_ORCHESTRATOR_URL=http://aws-orchestrator:10102
    ports:
      - "8080:80"
    depends_on:
      - aws-orchestrator
    restart: unless-stopped
    networks:
      - aws-orchestrator-net

networks:
  aws-orchestrator-net:
    driver: bridge
```
2. Start the Stack
Once your docker-compose.yml and .env are in place in the root directory:
- Ensure the Docker daemon is running on your host.
- Start the stack with Compose:

```bash
docker compose up -d
```

- Open your browser to http://localhost:8080 to access the A2UI-powered TalkOps Interface and begin chatting with the orchestrator.
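After `docker compose up -d` returns, the containers may take a few seconds to become reachable. A small, hypothetical readiness poller (not part of the project; any URL and timeout values are illustrative) could wait for the UI before you open the browser:

```python
import time
import urllib.error
import urllib.request

def wait_for_service(url, timeout=60.0, interval=2.0):
    """Poll `url` until it answers any HTTP response, or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True
        except urllib.error.HTTPError:
            # The server answered (even with 4xx/5xx), so it is up.
            return True
        except (urllib.error.URLError, OSError):
            time.sleep(interval)  # Not up yet; retry until the deadline.
    return False
```

For example, `wait_for_service("http://localhost:8080")` returns True once the TalkOps UI starts answering, and False if it never comes up within the timeout.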
Customization and Agent Policies
To enforce internal compliance standards without changing Python code, edit the hitl-policies.md and AGENTS.md files that are mapped into the Coordinator memory store at boot. These files define which commands must pass through HITL approval gates before they execute.
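The exact schema of hitl-policies.md depends on your installation; consult the copy shipped with your deployment. Purely as an illustrative sketch, a policy entry might look like:

```markdown
<!-- Illustrative only: mirror the structure of your shipped hitl-policies.md -->
## Destructive Terraform commands
- Commands matching `terraform destroy` or `terraform apply` require HITL approval.
- Read-only commands such as `terraform plan` may run without an approval gate.
```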