Job Title: GCP Data Engineer
Location: Dallas, TX
Duration / Term: Full-time
Job Description:
Experience Desired: 10 Years.
We are seeking a highly skilled and experienced GCP Data Engineer with deep expertise in data modeling and data quality, and hands-on experience across core GCP services such as BigQuery, Dataflow, and Dataproc, as well as GCP DevOps practices. The ideal candidate will be a thought leader in cloud data architecture, capable of designing, developing, and implementing scalable and robust data pipelines and infrastructure on Google Cloud Platform.
Key Responsibilities:
- Design end-to-end data solutions using Google Cloud Platform.
- Develop data models and optimize data storage solutions for analytical and operational workloads.
- Implement data quality frameworks to ensure the consistency, accuracy, and integrity of data.
- Build scalable ETL/ELT pipelines using GCP-native tools such as Dataflow, Dataproc, and BigQuery.
- Leverage GCP DevOps tools for CI/CD, infrastructure-as-code, and deployment automation.
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and deliver high-quality data products.
- Write robust, reusable, and optimized code in Python for data processing and orchestration tasks.
- Implement security, governance, and monitoring for data solutions across GCP.
- Continuously assess and improve system performance and scalability.
Required Skills & Qualifications:
- 8 years of experience in Data Engineering or related roles.
- Strong experience in data modeling (dimensional, relational, NoSQL) and data quality management.
- Proven expertise in Google Cloud Platform (GCP), particularly:
  - BigQuery
  - Dataflow
  - Dataproc
  - Cloud Composer, Cloud Storage, Pub/Sub
- Proficiency in Python for building data workflows and automation.
- Hands-on experience with GCP DevOps tools: Cloud Build, Terraform/Deployment Manager, Cloud Monitoring, and CI/CD pipelines.
- Excellent problem-solving, communication, and stakeholder management skills.
- Experience with Agile methodologies and collaborative development environments.
Nice to Have:
- GCP Professional certifications (e.g., Professional Data Engineer, Professional Cloud Architect).
- Experience with other data processing tools such as Apache Beam and Spark.
- Exposure to data security and compliance practices in cloud environments.
Key Skills:
GCP, Python, Dataflow, Dataproc, BigQuery, data modeling, data quality