We are seeking an experienced GCP Data Engineer to join our team in Chicago, IL. The ideal candidate will have deep expertise in Google Cloud Platform (GCP), with strong hands-on experience in BigQuery, SQL, ETL pipelines, and other GCP data services. This is a fully onsite role requiring close collaboration with cross-functional teams in a fast-paced, agile environment.
Key Responsibilities:
- Design, develop, and optimize scalable data pipelines and ETL solutions using GCP tools and services.
- Work extensively with BigQuery, Cloud Dataflow, Cloud Functions, Cloud Storage, and Pub/Sub.
- Build and manage workflow orchestration using Apache Airflow (Cloud Composer).
- Migrate and transform large datasets from various sources to GCP environments.
- Write and optimize complex SQL queries for data extraction and analysis.
- Collaborate with business stakeholders, product managers, and analysts to prioritize and deliver data solutions.
- Implement Git workflows using tools like GitHub, GitLab, or Bitbucket.
- Troubleshoot data issues and perform administrative tasks, including user setup, project configuration, and job scheduling.
- Write and test Unix shell and PL/SQL scripts.
- Use ServiceNow for change and incident management.
Required Skills & Experience:
- 7 years of overall IT experience, with a strong background in data engineering.
- Expertise in BigQuery, SQL, and ETL processes.
- Expert-level proficiency in GCP and its associated data services.
- Experience with Cloud Dataflow, Cloud Pub/Sub, Cloud Composer, and Cloud Functions.
- Solid understanding of Git and experience with at least one cloud-based Git platform.
- Experience working in Agile/Scrum environments.
Nice to Have:
- Basic knowledge of Python or Java.