The Solutions Architect specializing in Cloud Orchestration will work with a small group of technologists and become a trusted advisor, delivering technical solutions that leverage Apache Airflow hosted on the Astronomer platform. In this role, the architect will engineer a wide range of use cases where Apache Airflow sits at the center of the solution, with the goal of rapidly developing solutions for those use cases.
- Help guide a small group of technologists in their Apache Airflow journeys, including identifying new use cases and onboarding new domain teams.
- Review, optimize, and tune data ingestion and extraction pipelines orchestrated by Airflow.
- Develop frameworks to manage and engineer a large number of pipelines for the entire domain.
- Build architecture, data-flow, and operational diagrams and documents with detailed physical and logical layers.
- Provide reference implementations of various activities, including composing data pipelines in Airflow, implementing new Airflow features, or integrating Airflow with third-party solutions (see the sketch after this list).
- Keep up with the latest Astro and Airflow features in order to better advise the technologists on impactful improvements they can make.
- Collaborate to build reusable assets, automation tools, and documented best practices.
- Interact with Domain and Engineering teams to channel product feedback and facilitate requirements discussions.
- Work with technology team members to ensure that they are realizing value in their Airflow and Astronomer journeys.
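By way of illustration, the kind of reference implementation described above might look like the following minimal sketch of an Airflow data pipeline, assuming Airflow 2.x with the TaskFlow API; the DAG id, schedule, and task bodies are hypothetical placeholders, not a prescribed implementation:

```python
# A minimal, hypothetical reference DAG using Airflow's TaskFlow API.
# The dag_id, schedule, and task bodies are illustrative placeholders only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_ingestion_pipeline",  # hypothetical name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["reference-implementation"],
)
def example_ingestion_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull records from a source system.
        return [{"id": 1, "value": "sample"}]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write records to a target store.
        print(f"Loaded {len(records)} records")

    # Passing the output of extract() into load() sets the task dependency.
    load(extract())


example_ingestion_pipeline()
```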
Skills Required:
- Experience with Apache Airflow in production environments.
- Experience in designing and implementing ETL, Data Warehousing, and ML/AI use cases.
- Proficiency in Python.
- Knowledge of Azure cloud-native data architecture.
- Demonstrated technical leadership on team projects.
- Strong oral and written communication skills.
- Willingness to learn new technologies and build reference implementations.
- Experience in migrating workflows from legacy schedulers (Tidal) to Apache Airflow.
- Experience in integrating with Azure Data Factory pipelines (see the sketch after this list).
- Experience with Snowflake and Databricks.
- Experience working with containers.
- SQL experience.
- Kubernetes experience, either on-premises or in the cloud.
- Enterprise data experience in regulated environments.
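As context for the Azure Data Factory skill above, triggering an ADF pipeline from Airflow might look like the following minimal sketch, assuming the official apache-airflow-providers-microsoft-azure package is installed; the connection id, resource group, factory, and pipeline names are hypothetical placeholders:

```python
# Hypothetical sketch: triggering an Azure Data Factory pipeline from Airflow
# via the official Microsoft Azure provider package. The connection id,
# resource group, factory, and pipeline names are placeholders only.
from datetime import datetime

from airflow.decorators import dag
from airflow.providers.microsoft.azure.operators.data_factory import (
    AzureDataFactoryRunPipelineOperator,
)


@dag(
    dag_id="example_adf_trigger",  # hypothetical name
    schedule=None,
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def example_adf_trigger():
    # Runs a named ADF pipeline and waits for it to finish.
    AzureDataFactoryRunPipelineOperator(
        task_id="run_adf_pipeline",
        azure_data_factory_conn_id="azure_data_factory_default",  # placeholder
        resource_group_name="my-resource-group",  # placeholder
        factory_name="my-data-factory",  # placeholder
        pipeline_name="my_adf_pipeline",  # placeholder
        wait_for_termination=True,
    )


example_adf_trigger()
```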