Title: Snowflake and DW Developer
Location: Reading, PA (Hybrid)
Type: Contract
Job Description:
- Must have a strong background in data warehousing, Snowflake, and Airflow.
- Design, implement, and optimize efficient ETL processes to transform raw data into actionable insights.
- Develop and maintain robust data warehouse solutions, including the implementation of star and snowflake schemas.
- Establish and manage reliable data pipelines to ensure timely data availability.
- Create modular, maintainable, and scalable dbt workflows for advanced data transformations.
- Leverage dbt testing, documentation, snapshotting, and Change Data Capture (CDC) for incremental data refreshes.
- Implement and manage Type 2 (slowly changing dimension) data modeling techniques for historical data.
- Develop reusable macros and packages using Python libraries and dbt packages.
- Optimize complex SQL queries and leverage Snowflake's performance-enhancing features such as Streams, Time Travel, partitioning, and clustering.
- Orchestrate data pipelines for both batch and near real-time data refresh scenarios.
- Write and optimize Snowflake SQL queries and stored procedures for seamless data transformation and integration.
- Ensure compliance with data governance policies and implement security controls for sensitive data.
What we need from you:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3-5 years of experience in data warehouse development and ETL tools (e.g., dbt, SSIS, Informatica, or Azure Data Factory).
- 1-2 years of experience with dbt and Snowflake.
- Proficiency in SQL/PL-SQL for data querying and optimization.
- Familiarity with Python for enhancing dbt pipelines.
- Strong analytical, problem-solving, and communication skills.
Additional knowledge and/or experience desired:
- Hands-on experience with ETL tools such as dbt, SSIS, Informatica, or Azure Data Factory.
- Knowledge of Snowflake including query writing and data integration.
- Familiarity with cloud data platforms such as Azure Synapse, Amazon Redshift, or Snowflake.
- Experience with Agile methodologies.
- Experience with CI/CD and infrastructure tools such as Jenkins, Docker, or Terraform.