Our client represents the connected world, offering innovative and customer-centric information technology experiences that enable Enterprises, Associates, and Society to Rise.
They are a USD 6 billion company with 163,000 professionals across 90 countries, helping 1,279 global customers, including Fortune 500 companies. They focus on leveraging next-generation technologies, including 5G, Blockchain, Metaverse, Quantum Computing, Cybersecurity, and Artificial Intelligence, to enable end-to-end digital transformation for global customers.
Our client is one of the fastest-growing brands and among the top seven IT service providers globally. They have consistently emerged as a leader in sustainability and were recognized among the 2021 Global 100 Most Sustainable Corporations in the World by Corporate Knights.
We are currently searching for a Data Engineer.
Responsibilities
- Infrastructure as Code (IaC): Architect and maintain GCP data infrastructure using Terraform to ensure consistency across environments (Dev, Stage, Prod); see the sketch after this list.
- Lakehouse Architecture: Design enterprise-grade storage and compute layers using BigQuery, GCS, and Dataproc.
- Automated Provisioning: Develop CI/CD pipelines for automated deployment of data resources (Pub/Sub topics, Dataflow jobs, Cloud Composer environments).
- Data Modeling: Create canonical and domain-specific data models to support AI/ML and operational products.
- Cross-Cloud Connectivity: Implement BigQuery Omni to enable federated queries across different cloud providers without moving data.
- Self-Service Enablement: Build federated data layers that allow downstream consumers to access data products autonomously.
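To make the IaC responsibility concrete, here is a minimal, hedged Terraform sketch of the kind of environment-parameterized provisioning the role describes. Every name in it (the variables, region, and resource names) is an illustrative assumption, not the client's actual configuration.

```hcl
# Minimal sketch, assuming a single "environment" variable drives naming
# so that Dev/Stage/Prod stay consistent, per the IaC responsibility above.

variable "environment" {
  description = "Deployment environment (dev, stage, prod)"
  type        = string
}

variable "project_id" {
  description = "Target GCP project (hypothetical)"
  type        = string
}

provider "google" {
  project = var.project_id
  region  = "us-central1" # assumed region
}

# Lakehouse storage layer: a raw-zone bucket in GCS
resource "google_storage_bucket" "raw_zone" {
  name                        = "${var.project_id}-raw-${var.environment}"
  location                    = "US"
  uniform_bucket_level_access = true
}

# Lakehouse serving layer: a BigQuery dataset for curated data
resource "google_bigquery_dataset" "curated" {
  dataset_id = "curated_${var.environment}"
  location   = "US"
}

# Ingestion entry point: a Pub/Sub topic for streaming events
resource "google_pubsub_topic" "events" {
  name = "events-${var.environment}"
}
```

A CI/CD pipeline like the one described above would typically run `terraform apply -var environment=dev` (and likewise for stage and prod), so all three environments are provisioned from the same definition and cannot drift apart.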
Requirements
- GCP Data Engineer with Terraform IaC experience.
- Architect and design an enterprise-grade, GCP-based data lakehouse leveraging BigQuery, GCS, Dataproc, Dataflow, Pub/Sub, Cloud Composer, and BigQuery Omni (see the sketch after this list).
- Define data ingestion, hydration, curation, processing, and enrichment strategies for large-scale structured, semi-structured, and unstructured datasets.
- Create data domain models, canonical models, and consumption-ready datasets for analytics, AI/ML, and operational data products.
- Design federated data layers and self-service data products for downstream consumers.
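For the BigQuery Omni requirement, one plausible Terraform shape is an AWS connection resource that lets BigQuery query data sitting in S3 in place. This is a hedged sketch: the connection ID, location, and IAM role ARN are hypothetical placeholders, and the argument names should be checked against the current google provider documentation.

```hcl
# Hedged sketch of a BigQuery Omni connection to AWS.
# All identifiers below are hypothetical placeholders.
resource "google_bigquery_connection" "omni_aws" {
  connection_id = "omni-aws-demo"
  location      = "aws-us-east-1" # Omni connections are created in an AWS region

  aws {
    access_role {
      # IAM role that BigQuery assumes to read the S3-resident data
      iam_role_id = "arn:aws:iam::123456789012:role/bigquery-omni-access"
    }
  }
}
```

Once such a connection exists, external tables defined over the S3 data can be queried from BigQuery without copying it into GCP, which is the federated "without moving data" point in the responsibilities above.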
Must-Have Languages
- Advanced Oral English.
- Native Spanish.
If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings on the Sequoia Careers Page.