Role: Sr GCP Data Engineer (Google Cloud Platform)
Location: Remote - India
Job description:
Develop, construct, test, and maintain data acquisition pipelines for large volumes of structured and unstructured data, including batch and real-time processing in Google Cloud
Build large and complex datasets based on business requirements
Construct big data pipeline architecture
Identify opportunities for data acquisition by working with stakeholders and business clients
Translate business needs to technical requirements
Leverage a variety of tools in the Google Cloud ecosystem, such as Python, Dataflow, Datastream CDC (Change Data Capture), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, and Cloud Storage, to integrate systems and data pipelines
Use logs & alerts to effectively monitor pipelines
Use SAP SLT to replicate SAP tables to Google Cloud
Develop JSON messaging structures for integration with various applications
Leverage DevOps and CI/CD practices (GitHub, Terraform) to ensure the reliability and scalability of data pipelines
Partition, cluster, and retrieve content in BigQuery, and use IAM roles and policy tags to secure the data
Use roles to secure access to datasets and authorized views to share data between projects
Design and build ingestion pipelines using REST APIs
Recommend ways to improve data quality, reliability, and efficiency
Experience Required:
5 years of experience in a GCP data engineering role (offshore)
At least 1 year of hands-on experience with the highlighted GCP tools (Python, SQL, Dataflow, Datastream CDC (Change Data Capture), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud Storage)