Senior Data Engineer (AWS & Snowflake)
Experience: 6 to 10 Years
Location: Islamabad, Lahore, or Karachi (Hybrid)
Employment Type: Full-Time

Position Overview
We are looking for a skilled Senior Data Engineer with strong AWS and Snowflake expertise to design, develop, and optimize scalable cloud-based data pipelines and modern data warehousing solutions. The role involves hands-on development, building high-performance ELT/ETL pipelines, and contributing to the architecture of a scalable cloud data platform. The ideal candidate has deep expertise in Snowflake data warehousing, AWS data services, and Python-based data engineering, along with a strong understanding of modern data engineering best practices.

Key Responsibilities
- Develop and maintain scalable batch and real-time data pipelines on AWS
- Build and optimize ELT/ETL pipelines using Snowflake, AWS Glue, and Spark on EMR
- Design and implement Snowflake data warehouse solutions, including schema design, clustering, and performance optimization
- Develop and manage Snowflake data models (Star/Snowflake schemas) for analytics and reporting
- Implement workflow orchestration using Apache Airflow (MWAA)
- Support serverless data processing using AWS Lambda
- Design and manage data lake architectures using Amazon S3 and Snowflake
- Implement data ingestion frameworks for structured and semi-structured data
- Optimize Snowflake query performance, storage usage, and compute costs
- Apply data governance, security, and role-based access controls using AWS DataZone and Snowflake features
- Monitor and troubleshoot data pipelines and platform issues using Amazon CloudWatch and other monitoring tools
- Collaborate with Analytics, BI, Data Science, and Business teams to enable data-driven decision-making
- Write clean, scalable, and efficient code, and participate in peer code reviews and architecture discussions
- Ensure data quality, reliability, and compliance with governance standards
Required Skills & Experience
- 6 to 10 years of experience in Data Engineering or related roles
- 3 years of hands-on experience designing and building data platforms on AWS
- Strong experience with Snowflake data warehouse architecture, development, and optimization
- Proficiency with AWS data services including:
- AWS Glue
- Amazon EMR (Spark)
- AWS Lambda
- Apache Airflow (MWAA)
- Amazon EC2
- Amazon CloudWatch
- Amazon Redshift
- Amazon DynamoDB
- AWS DataZone
- Amazon S3
- Strong programming skills in Python for data engineering and pipeline development
- Advanced SQL skills with query optimization and performance tuning
- Strong understanding of data warehousing concepts and dimensional modeling (Star/Snowflake)
- Experience building scalable ETL/ELT pipelines and handling large-scale datasets
- Familiarity with Agile/Scrum delivery environments
- Strong problem-solving and analytical skills
Nice to Have
- AWS Professional Certification (Solutions Architect / Data Analytics Specialty)
- Snowflake Certification (SnowPro Core or Advanced)
- Experience with real-time streaming technologies (Kinesis, Kafka)
- Exposure to data governance, data catalog, and metadata management tools
- Experience with CI/CD pipelines for data engineering workflows
- Knowledge of cost optimization strategies in AWS and Snowflake
Required Experience:
Senior IC