GCP Data Engineer

Recutify Inc.


Job Location:

Paramus, NJ - USA

Monthly Salary: Not Disclosed
Posted on: 1 hour ago
Vacancies: 1 Vacancy

Job Summary

GCP Data Engineer

Paramus NJ

FTE or Contract-to-Hire only (visa transfer is fine for CTH)

Detailed JD:

Designs, builds, and optimizes secure, scalable, and high-performance data pipelines and analytics solutions using Google Cloud Platform tools such as BigQuery, Dataflow, Dataproc, Composer, GCS, Cloud Functions, Cloud Run, and Pub/Sub. This role requires 5 years of experience, expertise in Python and SQL, and experience implementing data governance and CI/CD pipelines.

Key Responsibilities

Pipeline Development: Design, build, and optimize end-to-end data pipelines using GCP-native services (Dataflow, Dataproc, Cloud Storage) and Python.

Data Modeling & Architecture: Create high-quality, reproducible data models in BigQuery using partitioning, clustering, and materialized views to enhance performance and manage costs.
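
As an illustration of the partitioning and clustering the role calls for, the sketch below builds a BigQuery DDL statement in Python. The dataset, table, and column names (analytics.orders, order_date, customer_id, region) are hypothetical, and a real pipeline would submit the statement through the BigQuery client rather than just printing it.

```python
# Illustrative sketch only: compose a BigQuery CREATE TABLE statement
# with date partitioning and clustering. All identifiers are
# hypothetical examples, not part of the actual job's codebase.

def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Return DDL for a date-partitioned, clustered BigQuery table."""
    return (
        f"CREATE TABLE `{table}` "
        f"PARTITION BY DATE({partition_col}) "
        f"CLUSTER BY {', '.join(cluster_cols)} "
        f"AS SELECT * FROM `{table}_staging`"
    )

ddl = partitioned_table_ddl("analytics.orders", "order_date",
                            ["customer_id", "region"])
print(ddl)
```

Partitioning on the date column prunes scanned bytes (managing cost), while clustering on frequently filtered columns speeds up point lookups.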

Streaming & Real-time: Implement real-time streaming pipelines using Pub/Sub and Apache Beam/Spark Streaming.
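
To make the streaming bullet concrete, here is a minimal pure-Python sketch of the fixed (tumbling) window aggregation that a Dataflow/Apache Beam pipeline would apply to Pub/Sub messages. The event shape (timestamp, key) and the 60-second window size are assumptions for illustration; a production pipeline would use Beam's windowing transforms instead.

```python
# Hedged sketch of tumbling-window counting, simulating what a
# Beam/Dataflow streaming job does with timestamped Pub/Sub events.
from collections import defaultdict

def window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed windows, count per key."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (30, "click"), (65, "view"), (70, "click")]
print(window_counts(events))
# Events at t=5 and t=30 land in the window starting at 0;
# events at t=65 and t=70 land in the window starting at 60.
```

In Beam the same grouping would be expressed with `beam.WindowInto(beam.window.FixedWindows(60))` followed by a per-key count.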

Infrastructure & DevOps: Establish CI/CD pipelines for data workflows.
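
A common piece of CI/CD for data workflows is running unit tests on transform logic before deployment. The sketch below shows the kind of check a Cloud Build or similar pipeline might execute; the transform and field names are hypothetical.

```python
# Hedged sketch: a unit test a CI/CD stage could run before deploying
# a data workflow. normalize_record and its fields are made-up examples.

def normalize_record(rec):
    """Lowercase keys and strip whitespace from string values."""
    return {k.lower(): v.strip() if isinstance(v, str) else v
            for k, v in rec.items()}

def test_normalize_record():
    out = normalize_record({"Email": " a@b.com ", "Age": 30})
    assert out == {"email": "a@b.com", "age": 30}

test_normalize_record()
print("ok")
```

Gating deploys on tests like this keeps bad transforms out of production pipelines.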

Security & Governance: Implement best practices for data security, including IAM roles, encryption (CMEK), and VPC Service Controls.

Collaboration: Work with stakeholders to define requirements, mentor junior engineers, and produce technical documentation.

Required Technical Skills

Languages: Strong SQL and Python proficiency.

Platforms: Deep expertise in Google Cloud Platform (GCP).

Tools: BigQuery, Dataflow, Cloud Composer (Airflow), Pub/Sub, Cloud Storage, Dataproc, Cloud SQL, Cloud Run, Cloud Functions, Cloud Logging and Monitoring.

Data Modeling: Database design and ETL/ELT workflows.

Qualifications

Experience: 5 years in GCP data engineering

Must have delivered at least two to three end-to-end projects as a data engineer using GCP services.

Strong SQL and Python proficiency.

Strong understanding of database design and data modeling (relational, dimensional, NoSQL).

Expertise in data integration ETL/ELT and data pipeline development.

Knowledge of cloud security best practices identity management and networking.

Familiarity with DevOps, CI/CD, and containerization (Docker, Kubernetes).

Excellent communication and problem-solving skills.

Education: Bachelor's degree in Computer Science, Engineering, or a relevant field.

Certifications: Google Cloud Professional Data Engineer certification is highly preferred.
