Software International (SI) supplies technical talent to a variety of clients ranging from Fortune 100/500/1000 companies to small and mid-sized organizations in Canada/US and Europe.
We currently have a re-extendable, long-term contract role as a GCP Data Lead - Hybrid with our global SAP consulting client, working onsite at a large CPG/Entertainment client's office 2-3 days a week in downtown Toronto. This is a 6-month contract initially but will be extended long term.
Role: GCP Data Lead - Hybrid
Type: Contract
# of Roles: 1
Duration: 9-12 months to start, with possible extensions
Location: Toronto ON - Hybrid 2-3 days onsite in downtown Toronto
Rate: Open based on expectations as a C2C contractor
Role Overview
We are seeking a highly skilled Google Cloud Platform (GCP) Data Lead with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate will combine deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.
Key Responsibilities
1. Mentoring
- Provide technical direction, code reviews, and support for complex data solutions.
- Lead and mentor a team of data engineers in building ETL/ELT pipelines from SAP and other ERP sources into GCP.
- Set engineering standards, best practices, and coding guidelines.
- Collaborate with project managers: provide estimates, track progress, and remove roadblocks to ensure timely completion of work.
- Collaborate with BI teams and data analysts to enable reporting solutions.
2. Data Architecture & Modeling
- Design conceptual, logical, and physical data models to support analytics and operational workloads.
- Implement star, snowflake, and data vault models for analytical systems.
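For illustration only (not part of the role description), a minimal star-schema sketch: a hypothetical fact_sales table joined to two dimensions. SQLite stands in for BigQuery here, but the layout is the same.

```python
import sqlite3

# In-memory database; in practice these would be BigQuery tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
-- The fact table holds measures plus foreign keys to the dimensions (star schema).
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    amount REAL
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO dim_date VALUES (10, '2024-01-01', 2024), (11, '2024-01-02', 2024);
INSERT INTO fact_sales VALUES (100, 1, 10, 50.0), (101, 2, 11, 75.0);
""")

# Typical analytical query: aggregate the fact table by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
""").fetchall()
print(rows)  # [('Hardware', 125.0)]
```

A snowflake variant would further normalize the dimensions (e.g., split category into its own table); a data vault model would instead separate hubs, links, and satellites.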
3. Google Cloud Platform Expertise
- Design data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
- Implement cost optimization strategies for GCP workloads.
4. Data Pipelines & Integration
- Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
- Integrate data from multiple systems, including SAP BW, SAP HANA, and SAP BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
- Leverage integration tools such as Boomi for system interoperability.
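As a toy sketch of the orchestration pattern above (hypothetical task names, plain Python rather than actual Airflow code), the dependency ordering an Airflow DAG would encode can be expressed with the standard library's topological sorter:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical extract/transform/load tasks; in Cloud Composer each
# would be an Airflow operator rather than a plain function.
results = []

def extract_sap():
    results.append("extract_sap")

def extract_crm():
    results.append("extract_crm")

def transform():
    results.append("transform")

def load_bigquery():
    results.append("load_bigquery")

tasks = {"extract_sap": extract_sap, "extract_crm": extract_crm,
         "transform": transform, "load_bigquery": load_bigquery}

# Edges mirror Airflow's `>>` dependency operator: transform waits for
# both extracts, and the load waits for transform.
deps = {"transform": {"extract_sap", "extract_crm"},
        "load_bigquery": {"transform"}}

# static_order() yields each task only after all its predecessors.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(results)
```

The two extract tasks may run in either order (Airflow would parallelize them), but transform always precedes load_bigquery.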
5. Programming & Analytics
- Develop complex SQL queries for analytics, transformations, and performance tuning.
- Build automation scripts and utilities in Python.
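One hedged example of the "complex SQL" and Python scripting bullets combined: a window-function dedupe that keeps the latest record per key, run from Python. SQLite is used here for a self-contained sketch; the SQL itself is standard and would run on BigQuery with minor changes.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_updates (customer_id INTEGER, email TEXT, updated_at TEXT);
INSERT INTO customer_updates VALUES
    (1, 'old@example.com',  '2024-01-01'),
    (1, 'new@example.com',  '2024-02-01'),
    (2, 'only@example.com', '2024-01-15');
""")

# ROW_NUMBER() ranks rows per customer_id by recency; keeping rn = 1
# selects only the most recent update for each customer.
latest = conn.execute("""
    SELECT customer_id, email FROM (
        SELECT customer_id, email,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY updated_at DESC
               ) AS rn
        FROM customer_updates
    ) WHERE rn = 1
    ORDER BY customer_id
""").fetchall()
print(latest)  # [(1, 'new@example.com'), (2, 'only@example.com')]
```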
6. System Migration
- Lead on-premise-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
- Manage migration of SAP datasets to GCP ensuring data integrity and minimal downtime.
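A common integrity check during such migrations is reconciling source and target extracts by row-level hashing. Below is a minimal stdlib sketch under assumed, hypothetical data (it is not tied to any SAP API); the fingerprint is order-independent, so it tolerates rows arriving in a different order after migration.

```python
import hashlib

def dataset_fingerprint(rows):
    """Order-independent fingerprint: hash each row, then hash the sorted row hashes."""
    row_hashes = sorted(
        hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        for row in rows
    )
    return hashlib.sha256("".join(row_hashes).encode()).hexdigest()

# Hypothetical extracts: same rows, different order after migration.
source = [(1, "MATNR-001", 99.5), (2, "MATNR-002", 10.0)]
target = [(2, "MATNR-002", 10.0), (1, "MATNR-001", 99.5)]

print(dataset_fingerprint(source) == dataset_fingerprint(target))  # True
```

In practice the same comparison is usually paired with simple row counts and column-level aggregates (sums, min/max) on both sides.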
7. DevOps for Data
- Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
- Apply infrastructure-as-code principles for reproducible and scalable deployments.
Preferred Skills
- 8+ years of proven experience with GCP: BigQuery, Composer, Cloud Storage, Pub/Sub, Dataflow.
- Minimum 2-3 years of leadership experience mentoring small to mid-sized data engineering teams.
- Strong SQL and Python programming skills.
- Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
- Knowledge of data governance frameworks and security best practices.
- Familiarity with DevOps tools for data.
- Understanding of the Google Cortex Framework for SAP-GCP integrations.