Location: Hyderabad
Interview Mode: Video Interview
Shift: General Shift
Roles & Responsibilities
1. Orchestration & Workflow Engineering
- Design, develop, and implement Apache Airflow DAGs for end-to-end orchestration and data processing pipelines.
- Enable metadata-driven orchestration for scalable and dynamic data ingestion and transformation processes (a minimal sketch follows this subsection).
- Manage, optimize, and troubleshoot MWAA (Amazon Managed Workflows for Apache Airflow) environments on AWS.
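For context, here is a minimal, illustrative sketch of the metadata-driven DAG pattern referenced above. The PIPELINE_METADATA entries, dataset names, and S3 paths are hypothetical placeholders; in practice the metadata would typically be loaded from a config store, database table, or S3 object rather than inlined.

```python
# Illustrative sketch only: metadata-driven DAG generation in Apache Airflow.
# All names (datasets, buckets) are assumed placeholders, not part of the role description.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Inlined here to keep the sketch self-contained; normally sourced from config/S3/DB.
PIPELINE_METADATA = [
    {"name": "orders", "source": "s3://example-bucket/orders/"},
    {"name": "customers", "source": "s3://example-bucket/customers/"},
]


def ingest(source: str) -> None:
    """Placeholder ingestion step; real logic would load `source` into the platform."""
    print(f"Ingesting data from {source}")


for cfg in PIPELINE_METADATA:
    dag_id = f"ingest_{cfg['name']}"
    with DAG(
        dag_id=dag_id,
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
        tags=["metadata-driven"],
    ) as dag:
        PythonOperator(
            task_id="ingest",
            python_callable=ingest,
            op_kwargs={"source": cfg["source"]},
        )
    # Registering each DAG object in the module namespace lets Airflow discover it.
    globals()[dag_id] = dag
```

Adding a new pipeline in this pattern is a metadata change rather than a code change, which is what makes the orchestration scalable and dynamic.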
2. Cloud Engineering & Infrastructure Automation
- Build, configure, and deploy environments using AWS CloudFormation templates.
- Integrate Airflow with major AWS services such as S3, Lambda, Glue, and CloudWatch, and with modern data platforms like Snowflake (see the sketch after this subsection).
- Enable elastic scaling of Airflow clusters to support varying workload demands.
- Implement CloudWatch-based logging, metrics, and alerting for monitoring DAG performance and operational usage.
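The sketch below shows one way an Airflow DAG might chain S3, Glue, and Snowflake steps using the Amazon and Snowflake provider operators. The bucket name, key pattern, Glue job name, connection ID, and stored procedure are illustrative assumptions, not requirements of the role.

```python
# Hedged sketch: Airflow DAG integrating S3, Glue, and Snowflake via provider operators.
# Bucket, key, job, and connection names below are assumed placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="s3_glue_snowflake_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Wait for the day's raw file to land in S3.
    wait_for_file = S3KeySensor(
        task_id="wait_for_file",
        bucket_name="example-raw-bucket",
        bucket_key="incoming/{{ ds }}/data.csv",
    )

    # Run a pre-existing Glue job that transforms the raw data.
    transform = GlueJobOperator(
        task_id="run_glue_transform",
        job_name="example-transform-job",
    )

    # Refresh the curated table in Snowflake (assumed stored procedure).
    load_snowflake = SnowflakeOperator(
        task_id="load_snowflake",
        snowflake_conn_id="snowflake_default",
        sql="CALL refresh_curated_orders();",
    )

    wait_for_file >> transform >> load_snowflake
```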
3. DevOps, CI/CD & Quality Engineering
- Develop, manage, and optimize CI/CD pipelines using Jenkins for deployment automation.
- Integrate with code quality and security tools such as SonarQube and Snyk to enforce engineering best practices.
- Set up and maintain local development environments for unit testing, debugging, and workflow validation (a test sketch follows this subsection).
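As a hedged example of local workflow validation, the pytest-style checks below load a DagBag and fail fast on import errors or missing tags. The dags/ folder path and the specific governance checks are assumptions for illustration; such tests typically run both locally and in the CI pipeline.

```python
# Illustrative DAG validation tests (pytest style); the dags/ path is an assumed layout.
from airflow.models import DagBag


def test_dags_import_without_errors():
    """Fail if any DAG file has an import or parse error."""
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    assert dag_bag.import_errors == {}, f"DAG import errors: {dag_bag.import_errors}"


def test_dags_declare_tags():
    """Basic governance check: every DAG declares at least one tag."""
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    for dag_id, dag in dag_bag.dags.items():
        assert dag.tags, f"{dag_id} has no tags"
```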
4. Monitoring, Governance & Utility Development
- Build utilities and frameworks for auditability, monitoring, control, and operational governance in orchestration workflows.
- Define and implement pipeline processing metrics, SLAs, and quality checks to measure performance and reliability (see the sketch after this subsection).
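One possible (assumed, not prescribed) way to express per-task SLAs and emit a custom CloudWatch metric on failure is sketched below; the DataPipelines namespace, metric name, and schedule are illustrative placeholders.

```python
# Hedged sketch: task-level SLAs plus a custom CloudWatch failure metric from Airflow.
# Namespace, metric, and DAG names are assumed for illustration only.
from datetime import datetime, timedelta

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def publish_failure_metric(context):
    """On task failure, push a custom metric so a CloudWatch alarm can fire."""
    boto3.client("cloudwatch").put_metric_data(
        Namespace="DataPipelines",  # assumed namespace
        MetricData=[{
            "MetricName": "TaskFailure",
            "Dimensions": [{"Name": "dag_id", "Value": context["dag"].dag_id}],
            "Value": 1,
        }],
    )


def notify_sla_miss(dag, task_list, blocking_task_list, slas, blocking_tis):
    """Hook for SLA misses; wire this to alerting (e.g. SNS) as needed."""
    print(f"SLA missed in {dag.dag_id}: {task_list}")


with DAG(
    dag_id="governed_pipeline_example",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    sla_miss_callback=notify_sla_miss,
    default_args={
        "sla": timedelta(minutes=30),               # per-task SLA
        "on_failure_callback": publish_failure_metric,
    },
) as dag:
    PythonOperator(task_id="process", python_callable=lambda: print("processing"))
```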
5. Implementation & Delivery
- Participate in requirement analysis, design discussions, and implementation planning.
- Ensure high-quality delivery through best practices in coding, testing, and deployment.
- Collaborate with cross-functional teams to ensure seamless pipeline operations and continuous improvements.
Required Skills & Qualifications
- Strong proficiency in Python and Airflow DAG development.
- Extensive hands-on experience with MWAA and AWS services (S3, Lambda, Glue, CloudWatch).
- Experience integrating with Snowflake or similar modern data platforms.
- Strong understanding of metadata-driven orchestration and data modeling concepts.
- Proven expertise in Jenkins CI/CD automation and security tool integrations (SonarQube, Snyk).
- Experience using AWS CloudFormation for infrastructure provisioning.
- Solid understanding of DevOps principles, code quality, and secure development practices.
- Excellent problem-solving, debugging, and performance optimization skills.
Nice-to-Have & Soft Skills
- Strong leadership qualities with the ability to mentor junior team members.
- Excellent communication and stakeholder management skills.
- Ability to work effectively in a global delivery model, managing collaboration with offshore/onshore teams.