Dagu

Lightweight and powerful workflow engine

Define workflows in YAML. Execute with a single binary. No database or message broker required. Ideal for VMs, containers, and bare metal.

Demo

CLI: Execute workflows from the command line.

Web UI: Monitor, control, and debug workflows visually.

Try It Live

Explore without installing: Live Demo

Credentials: demouser / demouser

Why Dagu?

Dagu is a single, self-contained binary: no database, no message broker, and no external services to operate. Workflows are plain YAML files, so they version-control cleanly and run the same on VMs, containers, and bare metal.

Quick Start

Install

Linux/macOS:

bash
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash

Windows (PowerShell):

powershell
irm https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.ps1 | iex

Windows (cmd):

cmd
curl -fsSL https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.cmd -o installer.cmd && .\installer.cmd && del installer.cmd

Docker:

bash
docker run --rm -v ~/.dagu:/var/lib/dagu -p 8080:8080 ghcr.io/dagu-org/dagu:latest dagu start-all

Homebrew:

bash
brew install dagu

npm:

bash
npm install -g --ignore-scripts=false @dagu-org/dagu
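
To confirm the install, print the version of the binary:

bash
dagu version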

Create a Workflow

bash
cat > hello.yaml << 'EOF'
steps:
  - command: echo "Hello from Dagu!"
  - command: echo "Step 2"
EOF
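
Steps written this way run in the order they are declared. Names and explicit dependencies are optional; here is the same workflow with both spelled out (an equivalent sketch, in the style the larger example below uses):

yaml
steps:
  - name: hello
    command: echo "Hello from Dagu!"
  - name: step-2
    command: echo "Step 2"
    depends: hello   # run after the hello step completes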

Run

bash
dagu start hello.yaml
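
Workflows can also declare parameters and override them at launch; arguments after -- are passed as parameters. A small sketch with an illustrative NAME parameter:

yaml
params: NAME=World   # default value, overridable at start time
steps:
  - command: echo "Hello, ${NAME}!"

bash
dagu start hello.yaml -- NAME=Dagu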

Start Web UI

bash
dagu start-all

Visit http://localhost:8080
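
If port 8080 is already taken, the server's host and port can be changed in Dagu's config file. A sketch assuming the default config path ~/.config/dagu/config.yaml; see Configuration for the options available in your setup:

yaml
host: 127.0.0.1   # bind address for the web UI
port: 8081        # use a free port instead of the default 8080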

Key Capabilities

  • Nested Workflows - Reusable sub-DAGs with full execution lineage tracking
  • Distributed Execution - Label-based worker routing with automatic service discovery
  • Error Handling - Exponential backoff retries, lifecycle hooks, continue-on-failure
  • Step Types - Shell, Docker, SSH, HTTP, JQ, Mail, and more (see the executor sketch below)
  • Observability - Live logs, Gantt charts, Prometheus metrics, OpenTelemetry
  • Security - Built-in RBAC with admin, manager, operator, and viewer roles
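
As one example of a non-shell step type, a step can run inside a container via the docker executor. A short sketch, assuming the executor's image and autoRemove config keys; alpine:3 is an arbitrary image:

yaml
steps:
  - name: in-container
    executor:
      type: docker
      config:
        image: alpine:3      # any image available to the Docker daemon
        autoRemove: true     # clean up the container after the step
    command: echo "Hello from a container"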

Example

A data pipeline with scheduling, parallel execution, sub-workflows, and retry logic:

yaml
schedule: "0 2 * * *"
type: graph

steps:
  - name: extract
    command: python extract.py --date=${DATE}
    output: RAW_DATA

  - name: transform
    call: transform-workflow
    params: "INPUT=${RAW_DATA}"
    depends: extract
    parallel:
      items: [customers, orders, products]

  - name: load
    command: python load.py
    depends: transform
    retryPolicy:
      limit: 3
      intervalSec: 10

handlerOn:
  success:
    command: notify.sh "Pipeline succeeded"
  failure:
    command: alert.sh "Pipeline failed"
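
The cron schedule takes effect only while a scheduler process is running. dagu start-all (shown above) includes one, and the scheduler can also run on its own:

bash
dagu scheduler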

See Examples for more patterns.

Use Cases

  • Data Pipelines - ETL/ELT with complex dependencies and parallel processing
  • ML Workflows - GPU/CPU worker routing for training and inference (see the routing sketch below)
  • Deployment Automation - Multi-environment rollouts with approval gates
  • Legacy Migration - Wrap existing scripts without rewriting them
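
For the GPU/CPU routing mentioned above, steps declare label requirements that are matched against labels advertised by workers. A sketch assuming the workerSelector field from the distributed execution feature; gpu: "true" is an illustrative label:

yaml
steps:
  - name: train
    command: python train.py
    workerSelector:
      gpu: "true"   # only workers advertising this label pick up the step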

Quick Links: Overview | CLI | Web UI | API | Architecture

Learn More

  • Overview - What is Dagu and how it works
  • Getting Started - Installation and first workflow
  • Writing Workflows - Complete workflow authoring guide
  • YAML Reference - All configuration options
  • Features - Scheduling, queues, distributed execution
  • Configuration - Server, authentication, operations
  • Community

Released under the MIT License.