Senior GCP Data Engineer

Job Location

Manhattan, NY - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Location: NYC

Duration: 12 months

Role-Specific Technical Requirements (Based on Interview Questions)

Deep understanding of SQL compression techniques and optimization strategies.

Expert knowledge of GCP architecture, core services, and best practices.

Ability to design and implement Pub/Sub-based solutions for real-time, high-volume data ingestion.

Proven approach to ensuring low-latency data processing in distributed environments.

Demonstrated experience implementing security and access controls within GCP using IAM and security tools.

Hands-on experience using GCP Stackdriver for pipeline troubleshooting and operational visibility.

In-depth knowledge of GCP services, including:

    1. BigQuery (storage, performance tuning, querying)
    2. Dataflow (stream and batch processing)
    3. Dataproc (Hadoop/Spark clusters on GCP)
    4. Pub/Sub (asynchronous messaging and streaming)

Mastery of Python and SQL as scripting and query languages for data manipulation, validation, and ETL processes.
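As a sketch of the Python scripting expectations above, the snippet below shows a minimal record-validation step of the kind an ETL job might run before loading data. The schema (`id`, `event_ts`, `amount`) and function names are hypothetical illustrations, not part of this role's actual codebase:

```python
from datetime import datetime
from typing import Optional

# Hypothetical raw schema: each record should carry a non-empty string id,
# an ISO-8601 timestamp, and a non-negative numeric amount.
def validate_record(raw: dict) -> Optional[dict]:
    """Return a cleaned record, or None if the record fails validation."""
    try:
        record_id = str(raw["id"]).strip()
        ts = datetime.fromisoformat(raw["event_ts"])
        amount = float(raw["amount"])
    except (KeyError, TypeError, ValueError):
        return None
    if not record_id or amount < 0:
        return None
    return {"id": record_id, "event_ts": ts.isoformat(), "amount": amount}

def clean_batch(rows: list) -> tuple:
    """Split a batch into valid records and a count of rejected rows."""
    valid = [r for r in (validate_record(row) for row in rows) if r is not None]
    return valid, len(rows) - len(valid)
```

In practice the cleaned output would feed a parameterized SQL load or a BigQuery write, with the reject count emitted as a pipeline metric.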

Key Responsibilities

Design, build, and optimize end-to-end data pipelines using GCP-native services such as Dataflow, Dataproc, and Pub/Sub.

Implement data ingestion, transformation, and processing workflows using Apache Beam, Apache Spark, and scripting in Python.

Manage and optimize data storage using BigQuery, Cloud Storage, and Cloud SQL to ensure performance, scalability, and cost-efficiency.

Enforce enterprise-grade data security and access controls using GCP IAM and Cloud Security Command Center.

Monitor and troubleshoot data pipelines using Stackdriver and Cloud Monitoring to ensure high availability and low latency.

Collaborate closely with analysts, data scientists, and cross-functional product teams to understand business needs and deliver robust data solutions.
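The streaming side of these responsibilities can be illustrated without any GCP dependencies. The sketch below is a hypothetical, dependency-free stand-in for what a Beam pipeline on Dataflow would express with `FixedWindows` plus a per-key combine over events arriving from a Pub/Sub subscription; the event shape and function name are assumptions for illustration only:

```python
from collections import defaultdict

# Hypothetical event: (event_time_seconds, key, value), as might arrive
# from a Pub/Sub subscription in a Dataflow/Beam streaming job.
def tumbling_window_sums(events, window_size=60):
    """Group events into fixed (tumbling) windows and sum values per key.

    Returns {window_start: {key: sum_of_values}}, the same shape a
    Beam FixedWindows + CombinePerKey stage would produce.
    """
    windows = defaultdict(lambda: defaultdict(float))
    for event_time, key, value in events:
        window_start = int(event_time // window_size) * window_size
        windows[window_start][key] += value
    return {w: dict(per_key) for w, per_key in windows.items()}
```

A real Dataflow job would additionally handle watermarks, late data, and triggers, which is where the low-latency tuning mentioned above comes in.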

Required Skills and Qualifications

  • 12 years of overall IT experience with deep specialization in data engineering.
  • 8 years of hands-on experience designing, building, and maintaining data pipelines in enterprise environments.
  • 5 years of recent experience working with Google Cloud Platform (GCP), specifically within a major U.S. bank or brokerage firm (required; no exceptions).
  • Strong expertise in:
  • GCP services: Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Storage, Cloud SQL.
  • Data processing frameworks: Apache Beam and Apache Spark.
  • Scripting and automation: advanced proficiency in Python and SQL for data manipulation, transformation, and querying.
  • Proven experience implementing GCP IAM policies and managing data access/security at scale.
  • Demonstrated ability to ensure low-latency, high-throughput data systems through performance tuning and best practices.
  • Deep understanding of data compression, storage optimization, and cost-effective cloud design.

Employment Type

Full Time
