Data Engineer (Snowflake, dbt, AWS)
Opportunity in the insurance industry to lead enterprise data solutions using Snowflake, dbt, AWS, and Python. Contribute to large-scale digital transformation and AI enablement in a hybrid role based in Markham. Work on high-impact projects within a collaborative, fast-paced technology environment.
What is in it for you:
Salaried: $75-85 per hour.
Incorporated Business Rate: $90-100 per hour.
12-month contract.
Full-time position: 37.50 hours per week.
Hybrid model: 3 days per week on-site (subject to change).
Responsibilities:
Lead the design, development, and delivery of scalable data pipelines using dbt Core/Cloud and other modern tools.
Architect data ecosystems aligned with enterprise standards and business requirements.
Build, maintain, and optimize robust code in SQL, Python, Shell, and Terraform.
Design and review data models (conceptual, logical, and physical) to support business processes.
Create detailed solution design documents to guide engineering implementation.
Lead data testing strategies including development of test plans and validation processes.
Promote data governance practices and the use of data catalogs and quality frameworks.
Analyze and resolve complex technical issues across data platforms.
Facilitate scrum ceremonies and promote agile best practices within the team.
Mentor junior engineers and contribute to the continuous improvement of engineering practices.
Collaborate with cross-functional teams to deliver customer-centric data solutions.
Conduct technical presentations and provide design feedback across projects.
Plan and execute data release activities to ensure smooth deployments.
Contribute to recruitment by designing technical challenges and participating in interviews.
What you will need to succeed:
Bachelor's degree in Computer Science, Engineering, or a related field.
10 years of experience in data engineering, including leadership in enterprise-level projects.
Proven delivery of at least 5 large-scale data initiatives from design through implementation.
Advanced expertise in Snowflake, PostgreSQL, Amazon Aurora, and Hadoop.
Strong knowledge of NoSQL databases such as MongoDB.
Proficiency in data visualization tools such as Snowsight, Streamlit, Qlik, and SAP BusinessObjects.
Expert-level coding in SQL, Python, Shell, and Terraform.
Experience with orchestration tools such as Zena and AWS Managed Airflow.
High adaptability and resilience in complex fast-paced environments.
Strong communication and collaboration skills with the ability to lead and influence.
Familiarity with insurance business processes is an asset.
Experience with operationalizing AI/ML models is considered a plus.
Preferred certifications:
SnowPro Core
SnowPro Advanced: Data Engineer (DEA-C01 or DEA-C02)
dbt Developer
AWS Cloud Practitioner
Why Recruit Action?
Recruit Action (agency permit: AP-2504511) provides recruitment services through quality support and a personalized approach to both job seekers and businesses. Only candidates who match the hiring criteria will be contacted.
# AVICJP