Job Summary:
We are seeking a Snowflake Engineer to design, build, and optimize scalable data solutions using the Snowflake Data Cloud. The ideal candidate will have strong expertise in data modeling, ELT/ETL pipelines, SQL performance tuning, and cloud integration, enabling robust and efficient data processing for analytics and business insights.
Key Responsibilities:
- Design, develop, and maintain data pipelines and data models in Snowflake.
- Implement data ingestion, transformation, and integration processes from multiple structured and unstructured data sources.
- Optimize Snowflake performance through clustering, partitioning, caching, and query tuning (see the first sketch after this list).
- Build secure and efficient data sharing and data governance frameworks within Snowflake.
- Collaborate with analytics, BI, and data science teams to deliver high-quality datasets and data marts.
- Integrate Snowflake with ETL/ELT tools (Informatica, dbt, Matillion, Talend, etc.) and orchestration tools (Airflow, Azure Data Factory, AWS Glue).
- Implement role-based access control (RBAC), data masking, and other security mechanisms (see the second sketch after this list).
- Automate monitoring and management of data pipelines and ensure data quality and reliability (see the third sketch after this list).
- Stay updated with Snowflake's latest features, releases, and best practices.
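For illustration, a minimal sketch of the performance-tuning work described above. The table and column names (analytics.fact_orders, order_date, region) are hypothetical; SYSTEM$CLUSTERING_INFORMATION is a built-in Snowflake function.

```sql
-- Hypothetical example: define a clustering key on a large fact table so
-- Snowflake co-locates micro-partitions by the most common filter columns.
ALTER TABLE analytics.fact_orders
  CLUSTER BY (order_date, region);

-- Inspect how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION(
  'analytics.fact_orders',
  '(order_date, region)'
);
```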
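Likewise, a minimal RBAC and dynamic data masking sketch. The role, schema, and policy names are hypothetical; the statements follow standard Snowflake syntax.

```sql
-- Hypothetical example: a read-only analyst role with SELECT access.
CREATE ROLE IF NOT EXISTS analyst_ro;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;

-- Dynamic data masking: only a privileged role sees raw email addresses.
CREATE MASKING POLICY IF NOT EXISTS mask_email AS (val STRING)
  RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
       ELSE '***MASKED***'
  END;

ALTER TABLE analytics.marts.customers
  MODIFY COLUMN email SET MASKING POLICY mask_email;
```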
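And a sketch of pipeline automation and monitoring using a scheduled Snowflake task; the warehouse, task, and log-table names are illustrative only.

```sql
-- Hypothetical example: an hourly task that logs a row-count data-quality
-- check. Snowflake tasks are created suspended and must be resumed to run.
CREATE OR REPLACE TASK analytics.ops.dq_rowcount_check
  WAREHOUSE = transform_wh
  SCHEDULE = '60 MINUTE'
AS
  INSERT INTO analytics.ops.dq_log (checked_at, table_name, row_count)
  SELECT CURRENT_TIMESTAMP(), 'fact_orders', COUNT(*)
  FROM analytics.fact_orders;

ALTER TASK analytics.ops.dq_rowcount_check RESUME;
```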
Technical Skills:
- Core Expertise: Snowflake Data Warehouse (architecture, performance tuning, best practices)
- Data Modeling: Dimensional modeling (Star/Snowflake schema), normalization/denormalization (a star-schema sketch follows this list)
- SQL Expertise: Advanced SQL scripting, query optimization, stored procedures, and UDFs (a UDF sketch follows this list)
- ETL/ELT Tools: dbt, Informatica, Talend, Matillion, Apache Airflow, Azure Data Factory, or AWS Glue
- Programming Languages: Python, SQL, or Java (for automation and integrations)
- Cloud Platforms: AWS, Azure, or GCP (with Snowflake integration)
- Version Control: Git, CI/CD pipelines (Jenkins, GitLab CI/CD)
Optional (Nice-to-Have):
- Knowledge of Snowpark, Streamlit, or Snowflake Native Apps
- Experience with data lakes, Kafka, or real-time data streaming
- Familiarity with Power BI, Tableau, or Looker
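For illustration, a minimal star-schema sketch of the dimensional modeling listed above. All table and column names are hypothetical; note that Snowflake accepts primary/foreign key syntax but treats these constraints as informational rather than enforced.

```sql
-- Hypothetical star schema: one fact table referencing two dimensions.
CREATE TABLE dim_customer (
  customer_key INTEGER IDENTITY PRIMARY KEY,  -- surrogate key
  customer_id  STRING,                        -- natural/business key
  segment      STRING
);

CREATE TABLE dim_date (
  date_key  INTEGER PRIMARY KEY,  -- e.g. 20250131
  full_date DATE
);

CREATE TABLE fact_orders (
  customer_key INTEGER REFERENCES dim_customer (customer_key),
  date_key     INTEGER REFERENCES dim_date (date_key),
  order_amount NUMBER(12, 2)
);
```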
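And a small sketch of the stored-procedure/UDF skill above: a scalar SQL UDF. The function name and logic are illustrative; IFF is a built-in Snowflake conditional function.

```sql
-- Hypothetical example: a scalar SQL UDF that returns NULL instead of
-- raising a division-by-zero error.
CREATE OR REPLACE FUNCTION safe_divide(n FLOAT, d FLOAT)
  RETURNS FLOAT
AS
$$
  IFF(d = 0, NULL, n / d)
$$;

-- Usage: SELECT safe_divide(revenue, order_count) FROM sales_summary;
```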
Soft Skills:
- Strong analytical and problem-solving skills
- Ability to work collaboratively with cross-functional teams
- Strong communication and documentation skills
- Detail-oriented with a focus on data accuracy and reliability