Big Data Engineer (GCP)

OmegaHires

Job Location:

Phoenix, AZ - USA

Hourly Salary: USD 55 - 60
Posted on: 5 hours ago
Vacancies: 1 Vacancy

Job Summary

Job Title: Big Data Engineer (GCP)
Location: Phoenix AZ (Onsite/Hybrid)
Duration:

We are seeking an experienced Big Data Engineer with strong expertise in Google Cloud Platform (GCP) to design, build, and optimize scalable data pipelines and analytics solutions. The ideal candidate will have hands-on experience with BigQuery and GCP data services and will collaborate closely with data scientists, architects, and business stakeholders to deliver high-performance, reliable data systems.

Key Responsibilities

Data Engineering & Pipeline Development

  • Design, develop, and maintain scalable data pipelines using GCP services.
  • Build efficient ETL/ELT processes for structured and unstructured data.
  • Ensure data quality, integrity, and availability across systems.

GCP & Big Data Technologies

  • Work extensively with BigQuery, Dataflow, and Dataproc for data processing and analytics.
  • Optimize BigQuery queries for performance and cost efficiency.
  • Leverage GCP-native tools for scalable and resilient data architectures.

Programming & Processing

  • Develop data processing solutions using Python, Java, or Scala.
  • Implement batch and real-time data processing frameworks.

Workflow Orchestration & Automation

  • Design and manage workflows using Airflow or Cloud Composer.
  • Automate data pipelines and integrate with CI/CD processes.

Collaboration & Delivery

  • Partner with data scientists, analysts, and business teams to understand requirements.
  • Participate in Agile ceremonies and contribute to sprint deliverables.
  • Ensure timely delivery of high-quality data solutions.

Required Qualifications

  • 7 years of experience in Big Data Engineering.
  • Strong hands-on experience with GCP services (BigQuery, Dataflow, Dataproc).
  • Proficiency in Python, Java, or Scala for data engineering.
  • Strong SQL skills with experience in query optimization.
  • Experience with workflow orchestration tools (Airflow/Composer).
  • Familiarity with Agile methodologies and CI/CD practices.
  • Strong problem-solving and analytical skills.

Nice to Have

  • Experience with real-time streaming (Pub/Sub, Kafka).
  • Knowledge of data warehousing and data lake architectures.
  • Exposure to data governance and security best practices.

Required Experience:

Senior IC
