I'm a CS student and intern at the California DMV. I’ve spent the last few months migrating mainframe datasets (mainly VSAM KSDS) into cloud systems like RDS/Postgres — and I kept running into the same problem: there's no sane way to orchestrate workflows across mainframe and cloud systems.
Everything is fragmented:
- Mainframe teams write JCL and use job schedulers
- Cloud/data teams build Python ETL scripts and Airflow DAGs
- Integration is slow and manual, often amounting to long email threads and handoffs between siloed teams
So I built Grace, a declarative orchestration toolchain that lets you define end-to-end workflows in YAML, from COBOL batch jobs to S3 uploads or shell scripts.
https://github.com/graceinfra/grace
Grace handles job orchestration across environments, including JCL templating and submission, dataset transfer, inter-job data handoff, and structured logging for each step. The goal is to expose mainframe logic as atomic, reusable steps that can be integrated into modern infrastructure pipelines.
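To make that concrete, here's a rough sketch of the kind of workflow file I mean. The field names below are illustrative rather than Grace's actual schema (the repo has real examples):

```yaml
# Illustrative sketch only -- not Grace's real schema.
# A three-step hybrid pipeline: run a COBOL batch job on z/OS,
# land its output dataset in S3, then hand off to a local script.
name: nightly-settlement
jobs:
  - name: run-settlement
    type: execute                  # submit templated JCL to the mainframe
    program: SETTLE01
    outputs:
      - dataset: HLQ.SETTLE.REPORT

  - name: export-report
    type: transfer                 # move the dataset off-host
    depends_on: [run-settlement]
    source: HLQ.SETTLE.REPORT
    target: s3://my-bucket/settlement/report.csv

  - name: load-postgres
    type: shell                    # cloud-side tooling takes over from here
    depends_on: [export-report]
    command: ./scripts/load_report.sh s3://my-bucket/settlement/report.csv
```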
Mainframes will be around for decades to come. I don’t think rewriting millions of lines of COBOL is a good use of time or money; the real need is tooling that allows cloud-native systems to hook cleanly into legacy logic. Grace aims to be that bridge: a shared control plane for orchestrating hybrid infrastructure, without the glue code.
Would love feedback from anyone working in legacy modernization, DevOps, or infra.