Overview
We are seeking a highly skilled Snowflake Developer to design, develop, and optimize scalable data solutions using the Snowflake platform. The ideal candidate will have deep expertise in data modeling, ETL design, query optimization, and DevOps integration, along with a strong understanding of cloud-based data engineering best practices.
This role requires a blend of technical proficiency, analytical thinking, and collaboration with cross-functional teams to deliver high-quality, efficient, and secure data solutions that enable effective business decision-making.
Key Responsibilities
Data Engineering & ETL Development
- Design, develop, and optimize data models and ETL processes for efficient data storage, transformation, and analytics in Snowflake.
- Build and maintain end-to-end ETL pipelines integrating data from multiple sources into Snowflake.
- Leverage Snowflake features (Tasks, Streams, Snowpipe) to automate and orchestrate continuous and batch data loads (see the sketch after this list).
- Implement complex data transformation logic using SQL, Python, and Snowflake stored procedures, ensuring data accuracy, consistency, and integrity.
- Optimize data loads, queries, and storage for performance and cost efficiency using micro-partitioning, clustering, and caching strategies.
- Collaborate with data architects and business stakeholders to gather data requirements and translate them into scalable technical solutions.
- Ensure robust data quality, governance, and security across Snowflake environments.
- Document data processes, architectures, and solutions for maintainability and audit readiness.
- Troubleshoot and resolve data pipeline or Snowflake-related performance issues.
- Stay current with Snowflake features, enhancements, and best practices to continuously improve data architecture and design.
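To illustrate the Streams/Tasks pattern mentioned above, here is a minimal sketch using the Snowflake Python connector: it creates a stream on a raw landing table and a scheduled task that merges new rows into a target table. The object names (ANALYTICS_DB, RAW_ORDERS, ORDERS, LOAD_WH) and connection details are hypothetical placeholders, not part of this role description.

```python
# Sketch only: assumes snowflake-connector-python and a reachable Snowflake account.
# Object names (ANALYTICS_DB, RAW_ORDERS, ORDERS, LOAD_WH, ...) are hypothetical.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS_DB",
    schema="STAGING",
)
cur = conn.cursor()

# Capture row-level changes on the raw landing table.
cur.execute("CREATE STREAM IF NOT EXISTS RAW_ORDERS_STREAM ON TABLE RAW_ORDERS")

# A task that wakes every 5 minutes but only runs when the stream has new data.
cur.execute("""
    CREATE TASK IF NOT EXISTS LOAD_ORDERS_TASK
      WAREHOUSE = LOAD_WH
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO ORDERS (ORDER_ID, CUSTOMER_ID, AMOUNT, LOADED_AT)
      SELECT ORDER_ID, CUSTOMER_ID, AMOUNT, CURRENT_TIMESTAMP()
      FROM RAW_ORDERS_STREAM
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK LOAD_ORDERS_TASK RESUME")
cur.close()
conn.close()
```

In a full pipeline, Snowpipe would typically handle the file-to-raw-table step, with the stream and task picking up continuous transformation from there.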
Migration & Integration
- Lead and support data migration initiatives, particularly from SQL Server to Snowflake.
- Integrate Snowflake with external data sources and APIs to enable seamless data exchange and processing.
- Work with cloud-based data storage and compute services (e.g., AWS S3, Azure Blob Storage) for ingestion and processing (see the sketch below).
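As one concrete shape of the ingestion work above, the hedged sketch below loads files from an S3 bucket through an external stage into a staging table. The bucket URL, storage integration (S3_INT), file format, and table names are assumptions for illustration; a real setup would use the team's existing integrations and formats.

```python
# Sketch only: loads CSV files from a hypothetical S3 bucket into a staging table.
# The storage integration S3_INT is assumed to already exist (created by an admin).
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS_DB",
    schema="STAGING",
)
cur = conn.cursor()

# External stage pointing at the landing prefix in S3.
cur.execute("""
    CREATE STAGE IF NOT EXISTS ORDERS_STAGE
      URL = 's3://example-bucket/orders/'
      STORAGE_INTEGRATION = S3_INT
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Bulk-load any new files; COPY INTO skips files it has already loaded.
cur.execute("""
    COPY INTO RAW_ORDERS
    FROM @ORDERS_STAGE
    ON_ERROR = 'ABORT_STATEMENT'
""")

for row in cur.execute("SELECT COUNT(*) FROM RAW_ORDERS"):
    print("rows in RAW_ORDERS:", row[0])

cur.close()
conn.close()
```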
DevOps Integration (Secondary)
- Implement CI/CD pipelines for Snowflake using Azure DevOps (a deployment sketch follows this list), including:
  - Source control management
  - Automated deployments across environments
  - ETL pipeline versioning and release management
- Collaborate with DevOps teams to deploy and manage Snowflake scripts and ETL tool services efficiently.
- Apply DevOps best practices to data engineering workflows, ensuring consistency, reliability, and traceability.
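One possible shape for the CI/CD item above is a deployment step that applies versioned SQL migration scripts to a target Snowflake environment in filename order, invoked once per environment by an Azure DevOps stage. The migrations/ folder layout, database naming, and environment variables below are assumptions; in practice a dedicated tool such as schemachange or a vendor ETL deployer might replace this script.

```python
# Sketch only: applies versioned migration scripts (e.g. migrations/V001__init.sql)
# to the Snowflake environment named on the command line. Paths, env vars, and
# database naming are hypothetical; an Azure DevOps stage would run this per environment.
import os
import sys
from pathlib import Path

import snowflake.connector


def deploy(environment: str, migrations_dir: str = "migrations") -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role=os.environ.get("SNOWFLAKE_ROLE", "SYSADMIN"),
        database=f"ANALYTICS_{environment.upper()}",  # e.g. ANALYTICS_DEV, ANALYTICS_PROD
    )
    try:
        cur = conn.cursor()
        # Apply scripts in deterministic (sorted) order so every environment
        # sees the same sequence of changes.
        for script in sorted(Path(migrations_dir).glob("*.sql")):
            print(f"applying {script.name} to {environment}")
            # Naive statement split on ';' keeps the sketch short; real scripts
            # with procedures or embedded semicolons need a proper parser or tool.
            for statement in script.read_text().split(";"):
                if statement.strip():
                    cur.execute(statement)
        cur.close()
    finally:
        conn.close()


if __name__ == "__main__":
    deploy(sys.argv[1] if len(sys.argv) > 1 else "dev")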
Qualifications & Experience
- Proven hands-on experience as a Snowflake Developer delivering data warehousing and analytics solutions.
- Strong understanding of Snowflake architecture and features, including micro-partitioning, data sharing, and performance tuning.
- Advanced proficiency in SQL and stored procedures (Snowflake SQL and JavaScript).
- Experience with Python for automation and data processing.
- Demonstrated experience in data migration from SQL Server to Snowflake.
- Familiarity with ETL design and orchestration tools (dbt, SnapLogic, Talend, or Informatica preferred).
- Working knowledge of DevOps tools (Azure DevOps, Terraform, CloudFormation) for CI/CD automation and deployment.
- Excellent problem-solving, analytical, and communication skills, with the ability to collaborate effectively across teams.
Preferred Skills
- Experience with Snowpark (Python/Scala) for developing scalable data pipelines and advanced analytics workloads (see the sketch after this list).
- Exposure to AWS or Azure cloud ecosystems, including services such as Lambda, SQS, SNS, and Azure Functions.
- Experience integrating legacy Microsoft SQL platforms with modern cloud data architectures.
- Strong understanding of data governance, performance tuning, and optimization strategies.
- Continuous learning mindset with the ability to adapt to emerging tools and technologies.
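For the Snowpark item above, a minimal Python sketch: it opens a Snowpark session, reads a hypothetical raw table, aggregates it with DataFrame operations that execute inside Snowflake, and writes the result back. Table names and connection parameters are placeholders.

```python
# Sketch only: a small Snowpark (Python) pipeline step; requires snowflake-snowpark-python.
# Connection parameters and table names (RAW_ORDERS, DAILY_ORDER_TOTALS) are hypothetical.
import os
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_, to_date

session = Session.builder.configs({
    "account": os.environ["SNOWFLAKE_ACCOUNT"],
    "user": os.environ["SNOWFLAKE_USER"],
    "password": os.environ["SNOWFLAKE_PASSWORD"],
    "warehouse": "TRANSFORM_WH",
    "database": "ANALYTICS_DB",
    "schema": "STAGING",
}).create()

# Aggregate raw orders into a daily totals table; the transformation is pushed
# down to Snowflake, so no data leaves the platform.
daily_totals = (
    session.table("RAW_ORDERS")
    .with_column("ORDER_DATE", to_date(col("ORDER_TS")))
    .group_by("ORDER_DATE")
    .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)

daily_totals.write.mode("overwrite").save_as_table("DAILY_ORDER_TOTALS")
session.close()
```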