Data Engineer: Snowflake and AWS
Location: Melbourne, Australia (Contract)
Duration: 6 months (with possible extension)
About the Role
We are seeking an experienced Data Engineer with strong expertise in Airflow, AWS, and Snowflake to support the delivery of scalable, high-performance data solutions. This role involves building robust data pipelines, optimizing workflows, and working closely with cross-functional teams in a fast-paced environment.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Apache Airflow
- Build and manage data workflows leveraging AWS services (primarily ECS and S3)
- Develop, optimize, and manage data solutions in Snowflake
- Implement efficient ELT/ETL processes for large-scale data processing
- Monitor pipeline performance and troubleshoot data-related issues
- Collaborate with stakeholders to understand data requirements and deliver solutions
- Ensure data quality, reliability, and governance standards are maintained
Key Skills & Experience (Must-Have)
- Strong hands-on experience with Airflow (primary skill)
- Solid experience with AWS (primarily ECS and S3)
- Proven expertise in Snowflake (data development & optimization)
- Advanced SQL skills and strong understanding of data modeling
- Experience building and managing end-to-end data pipelines
Good to Have
- Experience with Terraform (Infrastructure as Code)
- Exposure to CI/CD tools such as GitHub Actions or Buildkite
- Scripting experience in Python or similar languages
Domain Experience (Preferred)
- Prior experience on Finance/Banking domain projects will be an added advantage