Role: Data Integration Engineer (Snowflake, SSIS, ETL, Databricks)
Location: Mississauga, ON
Type: Long-term contract
Key Responsibilities:
- Develop and maintain ETL/ELT pipelines and data integrations.
- Build and optimize Snowflake data models and pipelines.
- Create and manage SSIS packages for data movement and transformations.
- Use Databricks for scalable data processing and automation.
- Implement and support Data Warehouse solutions.
- Ensure data quality, governance, and performance across the pipeline.
- Troubleshoot, optimize, and document data workflows.
Required Qualifications:
- 8 years of experience as a Data Engineer.
- Strong hands-on experience with Snowflake, SSIS, ETL, and Databricks.
- Solid understanding of DWH concepts (fact/dimension modeling, schemas, governance).
- Advanced SQL skills.
- Experience with cloud platforms (Azure/AWS/GCP) is an asset.
Nice-to-Have:
- Python for scripting and automation.
- Experience with CI/CD, dbt, Airflow, or other orchestration tools.
- Familiarity with BI tools (Power BI, Tableau).
Thanks,
Sanjay Kumar