Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Snowflake as the core data warehouse platform.
- Build and optimize batch, file-based, and real-time streaming data ingestion processes from multiple sources.
- Develop data transformation logic using Snowflake SQL, stored procedures, and Python.
- Collaborate with architects, analysts, and business teams to translate requirements into technical solutions.
- Monitor and troubleshoot pipeline performance while ensuring data quality and availability.
- Maintain documentation for data flows, models, and system architecture.
- Partner with DevOps teams to implement CI/CD practices for data pipeline deployments.
Required Skills and Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5 years of experience in data engineering or similar roles.
- 3 years of strong hands-on experience with Snowflake, including advanced SQL, stored procedures, and performance tuning.
- Experience with data ingestion methods such as bulk loading, micro-batching, and streaming.
- Strong knowledge of AWS services such as S3, Lambda, and CloudWatch, or Azure services including Data Factory, Event Hubs, Blob Storage, and Azure Functions.
- Experience working in Azure cloud environments.
- Strong Python programming skills for automation, integrations, and data processing.
- Solid experience with SQL Server and relational databases.
- Knowledge of data modeling, security, governance, and best practices.
- Experience with Git and collaborative development environments.
Required Skills:
Snowflake