Break free from DSL hell. Dimensigon orchestrations speak your language: Shell, Python, HTTP/REST, or nested workflows. One engine. Every language.
A financial services company needs to deploy applications across dev, staging, and production environments with language-agnostic workflows.
{
"name": "deploy-financial-app",
"version": 1,
"steps": [
{
"name": "validate-deployment",
"action_type": "shell",
"code": "docker pull myapp:latest && \
docker inspect myapp:latest | jq '.RepoTags'",
"target": "prod-servers"
},
{
"name": "check-compliance",
"action_type": "python",
"code": "import hashlib\nwith open('/app/binary', 'rb') as f:\n sha256 = hashlib.sha256(f.read()).hexdigest()\n assert sha256 == '{{vault.expected_hash}}', 'Binary mismatch!'",
"timeout": 30
},
{
"name": "notify-audit",
"action_type": "http",
"method": "POST",
"url": "https://audit.internal/api/v1/deployments",
"headers": {
"Authorization": "Bearer {{vault.audit_token}}"
},
"body": {
"environment": "production",
"version": "{{vault.app_version}}",
"timestamp": "2025-02-20T06:00:00Z"
}
}
]
}
No custom DSL. No limitations. Use the right tool for each job.
Define base orchestrations, then compose them into complex workflows without duplication:
health-check.json - Generic health check across any service
deploy-service.json - Deploy + health check + rollback
full-pipeline.json - Multi-service orchestration calling the above
Orchestrate 10 services with 3 nested calls instead of 100 steps.
// health-check.json (reusable)
{
"name": "health-check",
"steps": [
{
"code": "curl -sf {{service_url}}/health",
"action_type": "shell"
}
]
}
// full-pipeline.json (composes 3 services)
{
"name": "deploy-all",
"steps": [
{
"name": "deploy-web",
"action_type": "orchestration",
"orchestration_id": "deploy-service",
"params": {
"service": "web-api"
}
},
{
"name": "deploy-worker",
"action_type": "orchestration",
"orchestration_id": "deploy-service",
"params": {
"service": "worker"
}
}
]
}
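To see why composition shrinks 100 steps to a handful of nested calls, here is a minimal sketch of how a resolver could expand nested `orchestration` steps into flat, executable steps. The `REGISTRY` dict, step shapes, and `flatten` helper are illustrative assumptions, not Dimensigon's actual API.

```python
# Hypothetical sketch: expanding nested "orchestration" steps.
# REGISTRY stands in for wherever stored orchestrations live.
REGISTRY = {
    "health-check": {
        "steps": [{"action_type": "shell", "code": "curl -sf {{service_url}}/health"}]
    },
    "deploy-service": {
        "steps": [
            {"action_type": "shell", "code": "deploy.sh {{service}}"},
            {"action_type": "orchestration", "orchestration_id": "health-check"},
            {"action_type": "shell", "code": "rollback-on-failure.sh {{service}}"},
        ]
    },
}

def flatten(orchestration):
    """Recursively replace nested orchestration calls with their leaf steps."""
    flat = []
    for step in orchestration["steps"]:
        if step.get("action_type") == "orchestration":
            flat.extend(flatten(REGISTRY[step["orchestration_id"]]))
        else:
            flat.append(step)
    return flat

pipeline = {
    "steps": [
        {"action_type": "orchestration", "orchestration_id": "deploy-service"},
        {"action_type": "orchestration", "orchestration_id": "deploy-service"},
    ]
}
```

Two nested calls expand to six leaf steps here; the same pattern applied to ten services keeps the top-level pipeline at ten lines regardless of how many steps each service's deployment contains.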
// Orchestration combining all languages
{
"name": "etl-pipeline",
"steps": [
{
"name": "extract-data",
"action_type": "shell",
"code": "aws s3 sync s3://data-lake /tmp/data --profile prod"
},
{
"name": "transform",
"action_type": "python",
"code": "import pandas as pd\ndf = pd.read_csv('/tmp/data/raw.csv')\ndf['timestamp'] = pd.to_datetime(df['timestamp'])\ndf.to_parquet('/tmp/data/transformed.parquet')"
},
{
"name": "load",
"action_type": "http",
"method": "POST",
"url": "https://dw.internal/api/ingest",
"files": {
"data": "/tmp/data/transformed.parquet"
}
}
]
}
No duct-taping languages together. Everything runs in the orchestration engine with shared vault variables.
Orchestrations maintain shared state across language boundaries. A Shell step stores data, Python reads it, HTTP submits it, all in one workflow.
"steps": [
{ "action_type": "shell", "code": "echo 'processed_count=42' > /tmp/state.txt" },
{ "action_type": "python", "code": "state = open('/tmp/state.txt').read(); print(f'Processed: {state}')" },
{ "action_type": "http", "body": { "metric": "{{vault.state}}" } }
]
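The mechanism can be pictured as a step runner that threads a shared context through every step, whatever language the step runs in. This is an illustrative sketch only: Dimensigon persists state through its vault, while the in-memory `run_steps` runner and step shape below are assumptions for the example.

```python
# Sketch: a tiny runner that threads shared state across step boundaries.
# Each step reads the context and may return updates to merge back in.
def run_steps(steps, context):
    for step in steps:
        result = step["run"](context)
        if result:
            context.update(result)
    return context

steps = [
    # stands in for the Shell step that stores data
    {"run": lambda ctx: {"processed_count": 42}},
    # stands in for the Python step that reads it
    {"run": lambda ctx: {"report": f"Processed: {ctx['processed_count']}"}},
]
```

A final HTTP step would then read `report` (or a vault variable) from the same context when building its request body.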
Run Shell and Python steps in parallel on different nodes while maintaining consistency through quorum-based locking.
"steps": [
{ "parallel": "true", "steps": [
{ "action_type": "shell", "target": "nodes-1-5", "code": "backup-database.sh" },
{ "action_type": "python", "target": "nodes-6-10", "code": "validate-integrity.py" }
]}
]
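The core idea behind quorum-based locking is that a node may proceed only when a strict majority of its peers grant the lock, so two partitions can never both hold it. A minimal sketch of that majority rule, with the vote list as a stand-in for per-node lock responses:

```python
# Sketch of the quorum rule behind quorum-based locking: proceed only
# when a strict majority of nodes grant the lock. Real Dimensigon
# locking involves more machinery; this shows only the counting rule.
def quorum_granted(votes):
    """votes: one boolean per node, True if that node granted the lock."""
    granted = sum(1 for v in votes if v)
    return granted > len(votes) / 2
```

With five nodes, three grants suffice; with the majority unreachable, the lock is denied and the parallel steps wait rather than risk divergence.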
Branch orchestrations based on conditions evaluated in Python, then execute appropriate Shell or HTTP actions.
"steps": [
{ "action_type": "python", "name": "check-quota",
"code": "usage = check_metrics(); \nif usage > 80: return {'proceed': True}" },
{ "action_type": "shell", "if": "check-quota.proceed",
"code": "scale-up-cluster.sh" }
]
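An `"if"` expression like `check-quota.proceed` can be read as a dotted path into the named step's result. How the engine resolves it internally is not shown here; the resolver below is an assumption that matches the syntax in the example.

```python
# Sketch: resolving an "if" condition such as "check-quota.proceed"
# against the results returned by earlier steps.
def should_run(condition, results):
    step_name, _, key = condition.partition(".")
    return bool(results.get(step_name, {}).get(key))

# results as the check-quota step might have returned them
results = {"check-quota": {"proceed": True}}
```

If the condition resolves falsy (or the step never ran), the dependent Shell step is simply skipped.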
Route Shell scripts to Linux nodes, Python to Kubernetes pods, HTTP to cloud APIs, all from one orchestration.
"steps": [
{ "action_type": "shell", "target": "aws-ec2",
"code": "aws elbv2 describe-load-balancers" },
{ "action_type": "python", "target": "gcp-k8s",
"code": "kubectl get nodes -o json" },
{ "action_type": "http",
"url": "https://api.azure.com/subscriptions/..." }
]
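Routing like this is essentially a dispatch table keyed on each step's `action_type` and `target`. The executor functions below are hypothetical placeholders (they format a description rather than execute anything); they only illustrate the dispatch pattern.

```python
# Sketch of per-step dispatch: pick an executor by action_type.
# The executors here are placeholders that describe, not execute.
EXECUTORS = {
    "shell": lambda step: f"run on {step['target']}: {step['code']}",
    "python": lambda step: f"exec on {step['target']}: {step['code']}",
    "http": lambda step: f"request to {step['url']}",
}

def dispatch(step):
    return EXECUTORS[step["action_type"]](step)
```

Adding a new runtime then means registering one more executor, not changing the orchestration format.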
Shell for sysadmin tasks, Python for data work, HTTP for APIs. No compromises.
Build libraries of reusable orchestrations. Compose them at scale.
Teams use languages they already know. No DSL to master.
All languages get secret injection and variable substitution automatically.
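The `{{vault.*}}` placeholders used throughout the examples above can be pictured as simple template substitution applied to every step before it runs. The regex-based renderer below is a sketch under that assumption, not Dimensigon's implementation; unknown placeholders are left untouched rather than erased.

```python
import re

# Sketch: substitute {{name}} placeholders from a variables mapping,
# as the vault injection in the examples above might behave.
def render(template, variables):
    return re.sub(
        r"\{\{\s*([\w.]+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )
```

Because substitution happens before dispatch, Shell, Python, and HTTP steps all receive already-rendered strings and need no language-specific secret handling.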