Job Posting: GCP Data Engineer
We are seeking an experienced GCP Data Engineer with a strong background in data engineering, cloud technologies, and optimization. The ideal candidate will have expertise in Google BigQuery, specifically in data ingestion, performance optimization, and data modeling. A deep understanding of cloud services such as Cloud Dataflow and Pub/Sub, and of data orchestration tools like Apache Airflow, is essential.
Key Responsibilities:
- Design, develop, and maintain efficient data pipelines and workflows in Google Cloud Platform (GCP).
- Optimize data ingestion processes and enhance performance in BigQuery.
- Implement and manage real-time data streaming using Pub/Sub and Cloud Dataflow.
- Develop and automate workflows using Apache Airflow.
- Collaborate with cross-functional teams to design scalable and reliable cloud solutions.
- Use Python and other tools to script and automate data processes.
- Apply DevOps principles for continuous integration and deployment (CI/CD) using GitHub and related tools.
Requirements:
- Proven expertise in Google BigQuery, including data modeling and performance optimization.
- Strong experience with Cloud Dataflow, Pub/Sub, and Apache Airflow.
- Proficiency in Python for data engineering tasks.
- Familiarity with DevOps practices and CI/CD pipelines.
- Experience using GitHub for version control and automation.
- Strong problem-solving and communication skills, with the ability to work in a fast-paced, collaborative environment.