Role: Airflow DevOps Engineer.
Location: St. Louis, MO (Onsite).
Duration: 12 Months.
Role Overview:
- Design, maintain, and optimize data-driven workflows using Apache Airflow that power core business operations. Combine deep Python development expertise with Airflow mastery to ensure high reliability, scalability, and efficiency. Support application teams through DAG development best practices, workflow optimization, and cluster administration.
Key Responsibilities:
Advanced Airflow Engineering:
- Develop production-grade DAGs using advanced Python, including dynamic DAG generation, custom operators, sensors, and task groups.
- Optimize existing workflows, identifying performance bottlenecks, resource inefficiencies, and execution failures.
- Implement scalable Airflow architectures supporting high-volume, mission-critical business processes.
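The dynamic DAG generation mentioned above typically means producing one DAG per source system from a config loop at module top level, so the scheduler discovers each one. Below is a minimal sketch of that pattern; the `SOURCES` config and `make_dag` factory are illustrative assumptions, and a lightweight stand-in class is used in place of `airflow.DAG` so the sketch stays self-contained.

```python
from dataclasses import dataclass, field


# Stand-in for airflow.DAG so this sketch runs without Airflow installed;
# a real DAG file would use `from airflow import DAG` instead.
@dataclass
class DAG:
    dag_id: str
    schedule: str
    tasks: list = field(default_factory=list)


# Illustrative per-source schedules; in practice this config often lives
# in Airflow Variables or a YAML file shipped alongside the DAG bundle.
SOURCES = {
    "orders": "@hourly",
    "inventory": "@daily",
}


def make_dag(source: str, schedule: str) -> DAG:
    """Factory producing one ingestion DAG per source system."""
    dag = DAG(dag_id=f"ingest_{source}", schedule=schedule)
    dag.tasks = [f"extract_{source}", f"load_{source}"]
    return dag


# Register each generated DAG at module top level so the scheduler's
# DAG-file parser can discover it by scanning module globals.
for source, schedule in SOURCES.items():
    dag = make_dag(source, schedule)
    globals()[dag.dag_id] = dag
```

The key design point is that every generated DAG must end up as a module-level name; DAGs built inside a function and never exported are invisible to the scheduler.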
Cluster Administration & Operations:
- Manage Airflow cluster operations (Celery Executor, Kubernetes Executor, multi-node deployments).
- Configure monitoring, alerting, and health checks to ensure 99.9% workflow availability.
- Support application teams with DAG development best practices, troubleshooting, and optimization guidance.
Development & Automation:
- Author custom Airflow operators, hooks, and sensors tailored to business requirements.
- Implement CI/CD integration for DAG deployment (GitHub Actions, Jenkins, GitLab CI).
- Design infrastructure as code for Airflow environments (Docker, Helm, Terraform).
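A common first gate in the CI/CD pipelines described above is a validation step that fails the build if any DAG file does not parse. Here is a minimal sketch of such a check; the `validate_dag_files` helper and the `dags/` directory layout are illustrative assumptions, not part of any specific pipeline. (Real pipelines usually go further and load files into Airflow's `DagBag` to catch import-time errors, not just syntax errors.)

```python
# Minimal CI validation step: byte-compile every DAG file and
# report the ones that fail, so broken DAGs never reach the scheduler.
import pathlib
import py_compile


def validate_dag_files(dag_dir: str) -> list[str]:
    """Return the paths of .py files under dag_dir that fail to compile."""
    failures = []
    for path in pathlib.Path(dag_dir).rglob("*.py"):
        try:
            # doraise=True turns compile problems into PyCompileError
            py_compile.compile(str(path), doraise=True)
        except py_compile.PyCompileError:
            failures.append(str(path))
    return failures
```

A CI job (GitHub Actions, Jenkins, or GitLab CI alike) would run this against the repository's DAG folder and exit nonzero if the returned list is non-empty.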
Required Technical Expertise:
- Apache Airflow: dynamic DAGs, custom operators, XComs, sensors, TaskGroups, Variables, Connections.
- Python: advanced development (DAG authoring, plugins, pytest, type hints, async).
- Executors: Celery Executor, Kubernetes Executor, scaling, high availability.
- DevOps: Docker, Kubernetes, Helm charts, Terraform, CI/CD pipelines.
- Monitoring: Prometheus, Grafana, Airflow metrics, PagerDuty/Slack alerts.
Mandatory Certifications:
- CBAP (Certified Business Analysis Professional).
- Apache Airflow Certification (Astronomer or official certification).
Experience Profile:
- 5 years of hands-on Airflow DAG development and optimization.
- Advanced Python expertise for production Airflow environments.
- Proven Airflow cluster administration experience.
- Experience supporting application development teams.