Position: GCP Data Engineer
Location: Richardson, TX
Duration: Long term contract
Interview: F2F (Face-to-Face)/Onsite
Job Summary:
We are seeking an experienced Google Cloud Platform (GCP) Data Engineer to design, build, and optimize data pipelines and analytics solutions. The ideal candidate will have hands-on expertise with GCP services, ETL/ELT processes, and big data technologies, enabling the delivery of scalable, high-performance data solutions.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL/ELT workflows using GCP services such as BigQuery, Dataflow, Pub/Sub, Data Fusion, Dataproc, and Cloud Storage.
- Build and optimize data warehouses and data lakes on GCP.
- Collaborate with data scientists, analysts, and business stakeholders to deliver data models that meet business needs.
- Implement data quality, governance, and security best practices.
- Monitor and troubleshoot pipeline performance using Cloud Monitoring and Cloud Logging.
- Automate data workflows and improve efficiency using Python, SQL, and scripting tools.
- Stay updated on emerging GCP services and recommend adoption where beneficial.
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- 5 years of experience as a Data Engineer, including hands-on work with GCP.
- Strong skills in SQL and Python.
- Experience with BigQuery and at least two of the following: Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Composer.
- Knowledge of data modeling, data warehousing, and big data architectures.
- Experience with batch and streaming data processing.
- Understanding of IAM, data security, and compliance requirements.
Preferred Qualifications:
- GCP Professional Data Engineer or Associate Cloud Engineer certification.
- Experience with Apache Beam, Kafka, or similar streaming frameworks.
- Familiarity with CI/CD pipelines for data engineering workflows.
- Exposure to machine learning data preparation pipelines.