Role: Snowflake Data Architect
Location: Markham, Ontario, CA
Hire Type: Contract
Job Description:
Primary Skills
- Architect and implement advanced data solutions using Snowflake on AWS, ensuring scalable, secure, and high-performance data environments
- Extensive experience (8+ years) in data architecture and engineering, with a proven track record in large-scale data transformation programs, ideally in insurance or financial services
- Proven experience architecting and implementing advanced data solutions using Snowflake on AWS
- Expertise in designing and orchestrating data acquisition pipelines using AWS Glue for ETL/ELT, Snowflake OpenFlow, and Apache Airflow for workflow automation, enabling seamless ingestion of data from diverse sources
- Deep expertise in Snowflake, with hands-on experience delivering Snowflake as an enterprise capability
- Hands-on experience with AWS Glue for ETL/ELT, Apache Airflow for orchestration, and dbt for transformation, preferably deployed on AWS ECS
- Proficiency in SQL, data modelling, and ETL/ELT processes
- Proven experience using dbt to manage and automate complex data transformations within Snowflake, ensuring modular, testable, and version-controlled transformation logic
- Experience implementing lakehouse solutions (Medallion architecture) for financial or insurance carriers
- Experience optimizing and tuning Snowflake environments for performance, cost, and scalability, including query optimization and resource management
- Experience architecting and leading migration of workloads from Cloudera to Snowflake
- Experience evaluating data technology platforms, including data governance suites and data security products
- Develop robust data models and data pipelines to support data transformation, integrating multiple data sources and ensuring data quality and integrity
- Document architecture, data flows, and transformation logic to ensure transparency, maintainability, and knowledge sharing across teams
- Strong knowledge of data lifecycle management, data retention, and data modelling, plus working knowledge of cloud computing and modern development practices
Secondary Skills
- Familiarity with data mesh principles, data product delivery, and modern data warehousing paradigms
- Experience designing Streamlit apps and defining new capabilities and data products leveraging Snowflake ML and MLOps capabilities
- SnowPro Advanced certification preferred
- Knowledge of scripting languages (Python, Java)
- Experience with data governance, metadata management, and data quality frameworks (e.g., Collibra, Informatica)
- Experience in the insurance domain
- Experience converting policy data from legacy to modern platforms
- Exposure to enterprise data warehouse solutions such as Cloudera and AWS Redshift, and to Informatica tool sets (IDMC, PowerCenter, BDM)