Role: Senior GCP Data Engineer
Location: Houston, TX (Onsite)
Duration: 6 Months
Only locals
Job Description:
- Looking for a GCP Data Engineer with 10 years of experience.
- As a Senior Data Engineer, you'll be the architect and builder of our data backbone.
- We expect you not just to write code but to design scalable, resilient systems that turn raw data into strategic assets. You'll be working in a high-velocity environment where Google Cloud Platform (GCP) is our primary playground.
- Design and maintain complex ETL/ELT pipelines using Cloud Dataproc for large-scale processing (an illustrative PySpark sketch follows this list).
- Implement robust messaging and streaming architectures using Cloud Pub/Sub (sketch after this list).
- Develop lightweight event-driven microservices and data triggers using Cloud Functions and Cloud Run.
- Architect and optimize storage solutions on Google Cloud Storage (GCS), ensuring data security and cost-efficiency (lifecycle sketch after this list).
- Utilize Apache Pig on Dataproc (Pig queries) for analyzing and transforming massive datasets.
- Write clean, production-grade Python code for data manipulation, API integrations, and automation scripts.
- Big Data Processing: expert-level experience with Cloud Dataproc and Apache Pig.
- Cloud Storage: deep understanding of GCS bucket policies, lifecycle rules, and storage classes.
- Strong experience with Pub/Sub for decoupled systems.
- Proficiency in deploying containerized apps via Cloud Run and lightweight functions via Cloud Functions.
- Programming: advanced Python (including libraries such as Pandas, PySpark, or Apache Beam).
- Data Design: experience in designing schemas and managing data lifecycles.
- Experience with Infrastructure as Code (Terraform).
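For illustration only (not part of the client's requirements): a minimal PySpark sketch of the kind of Dataproc ETL pipeline described above, assuming hypothetical gs://raw-events and gs://curated-events buckets and made-up column names.

# Minimal Dataproc-style PySpark ETL sketch; bucket names and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw-to-curated").getOrCreate()

# Read raw JSON events landed in GCS (the gs:// connector is preinstalled on Dataproc clusters).
raw = spark.read.json("gs://raw-events/dt=2024-01-01/*.json")

# Basic cleanup: de-duplicate, normalize the timestamp, derive a partition column, drop bad rows.
curated = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_id").isNotNull())
)

# Write back to GCS as partitioned Parquet for downstream consumers.
curated.write.mode("overwrite").partitionBy("event_date").parquet("gs://curated-events/events/")

spark.stop()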
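Similarly, a hedged sketch of the Pub/Sub publish/consume pattern referenced above; the project ID, topic name, and payload fields are placeholders, and the subscriber is written as a first-generation background Cloud Function.

# Publisher side: push an event onto a Pub/Sub topic (project and topic names are hypothetical).
import base64
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "order-events")

payload = json.dumps({"order_id": "A-123", "status": "CREATED"}).encode("utf-8")
future = publisher.publish(topic_path, data=payload, source="checkout")
print("Published message id:", future.result())

# Subscriber side: entry point for a Pub/Sub-triggered background Cloud Function.
def handle_order_event(event, context):
    message = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    print(f"Processing order {message['order_id']} with status {message['status']}")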
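Finally, a short sketch of GCS lifecycle and storage-class management with the google-cloud-storage client; the bucket name and retention ages are illustrative assumptions only.

# Attach lifecycle rules to a GCS bucket (bucket name and ages are hypothetical).
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("analytics-landing-zone")

# Move objects to Nearline after 30 days, then delete them after a year, to control storage cost.
bucket.add_lifecycle_set_storage_class_rule("NEARLINE", age=30)
bucket.add_lifecycle_delete_rule(age=365)
bucket.patch()

print("Lifecycle rules now on the bucket:", list(bucket.lifecycle_rules))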
Best Regards:
Peerbhi SK
Phone: 1-
Email: