The core idea: define workflows in YAML, run them as scheduled jobs or on-demand. Single Go binary, no external dependencies, stores everything locally.
New in this release:

- Nested DAGs - workflows can call other workflows (sketch below)
- Refactored execution history storage (10x faster for large histories)
- Better debugging UI showing precondition results and variable outputs
- Proper job queue management
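For the nested DAG feature, a parent workflow referencing a child might look roughly like this. This is a sketch based on the feature description; the exact field names (`run`, `params`) and the illustrative values are assumptions, so check the release docs for the actual syntax:

```yaml
name: parent-pipeline
schedule: "0 9 * * *"
steps:
  - name: run-child
    # Invoke another workflow as a step; 'run' and 'params' are assumed field names
    run: data-pipeline
    params: "DATE=2024-01-01"
  - name: summarize
    command: python summarize.py
    depends: run-child
```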
Example workflow:

```yaml
name: data-pipeline
schedule: "0 10 * * *"
steps:
  - name: fetch-data
    command: curl https://api.example.com/data > raw.json
  - name: process
    command: jq '.items[] | select(.active)' raw.json > processed.json
    depends: fetch-data
  - name: load
    command: python load_to_db.py processed.json
    depends: process
```
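If you have the binary installed locally, something like `dagu start data-pipeline.yaml` should trigger this workflow on demand rather than waiting for the schedule (assuming the `start` subcommand works as in previous releases); otherwise you can run it from the web UI.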
Try it:

```
docker run --rm -p 8080:8080 ghcr.io/dagu-org/dagu:1.17.0-beta.1 dagu start-all
```
Then visit http://localhost:8080
We use it in production for ETL pipelines, report generation, and system maintenance tasks. It's not trying to be Airflow - no distributed execution, no Python dependency hell, just a reliable way to run workflows.
GitHub: https://github.com/dagu-org/dagu
Would love feedback on the beta, especially around the nested DAG implementation and any performance issues you encounter.