Job Description:
- Build and maintain scalable data pipelines using PySpark/Spark.
- Design and orchestrate workflows with Apache Airflow.
- Develop efficient data models and queries in BigQuery.
- Work with GCP Big Data tools to integrate and process large-scale datasets.
- Ensure data quality, performance, and security.