This is a remote position.
We are seeking an experienced Cloud Data Engineer to design, build, and optimize modern data platforms using Snowflake and/or Databricks. This role focuses on developing scalable data pipelines, implementing reliable transformation workflows, and supporting advanced analytics across cloud environments.
You will work closely with analytics, BI, and engineering teams to ensure high-quality, performant, and secure data systems. The ideal candidate has strong SQL and Python skills and experience building production-grade data infrastructure on AWS, Azure, or GCP.
Key Responsibilities:
Design and build scalable data pipelines (ETL/ELT)
Develop and optimize transformations in Snowflake or Databricks
Implement data modeling strategies (star schema, lakehouse, etc.)
Improve query performance and cost efficiency
Integrate orchestration tools (Airflow or similar)
Support analytics and reporting teams with reliable datasets
Ensure data quality, governance, and monitoring
Collaborate with cross-functional teams on data initiatives
Requirements:
4+ years of experience in Data Engineering
Strong SQL and Python skills
Hands-on experience with Snowflake and/or Databricks
Experience with Spark (batch and/or streaming)
Experience building ETL/ELT pipelines
Familiarity with orchestration tools (Airflow or similar)
Experience working in AWS, Azure, or GCP environments
Strong understanding of data modeling concepts
Preferred Qualifications:
Experience with dbt or similar transformation tools
Experience with real-time data streaming (Kafka, Kinesis, or Pub/Sub)
Experience implementing data quality frameworks
Knowledge of BI tools and downstream analytics use cases
Required Skills:
Experience with AWS, Azure, or GCP
Strong knowledge of networking, security, and cloud services
Experience with Terraform, ARM, or CloudFormation
Familiarity with containers and orchestration tools
Strong troubleshooting and optimization skills