At TTEC Digital, we coach clients to ensure their employees feel valued and fully supported, because an amazing customer experience is an employee-first process. Our vision is the same: a workplace where employees know they can thrive.
Position Purpose
The GCP Data Engineer will be part of the Data & Analytics team, responsible for designing, building, and maintaining scalable data pipelines and cloud-based data platforms on Google Cloud Platform (GCP).
This role focuses on transforming raw data into reliable, efficient data systems that enable advanced analytics, reporting, and data-driven decision-making.
The Data Engineer will collaborate with Data Scientists, Architects, Project Managers, and business stakeholders on implementation, migration, modernization, and optimization initiatives.
Key Responsibilities
Design, build, and maintain scalable data pipelines on GCP
Develop and optimize batch and real-time data processing solutions
Build and manage data lakes and data warehouses
Develop and maintain datasets for analytics and reporting
Improve data quality, reliability, and efficiency
Create solution design and technical documentation
Work independently and as part of cross-functional teams
Assist in pre-sales activities by providing effort estimates and technical inputs
Collaborate with Project Management to ensure on-time, on-budget delivery
Implement DevOps and CI/CD practices for data pipelines
Competencies
Personal
Strong analytical and problem-solving skills
High energy, ownership mindset, and accountability
Flexible and results-oriented
Strong communication and stakeholder management skills
Leadership
Ability to guide junior team members
Work effectively with diverse technical and business teams
Provide technical mentoring and best practices
Operations
Manage multiple projects and priorities effectively
Deliver customer-focused solutions
Optimize performance and resource utilization
Technical
Strong understanding of cloud-native data architecture
Ability to communicate complex technical concepts clearly
Experience with Agile/Scrum methodologies
Technical Skills
GCP Core Services
BigQuery
Cloud Storage
Cloud Dataflow
Cloud Dataproc
Pub/Sub
Cloud Composer (Airflow)
Cloud Functions
Cloud Run
Programming & Data
Python (Pandas, NumPy, PySpark)
SQL (advanced querying and optimization)
Spark (Batch & Streaming)
Data Engineering & Integration
ETL/ELT pipeline development
Data modeling (Star & Snowflake schema)
Real-time streaming architectures
API integrations and REST services
DevOps & Containerization
CI/CD (GitHub Actions, Jenkins, or similar)
Docker
Kubernetes
Infrastructure as Code (Terraform preferred)
Education, Experience, and Certification
Bachelor's degree in Computer Science, Engineering, MIS, or a related field
5-8 years of experience in Data Engineering
3+ years of hands-on experience with Google Cloud Platform
2+ years of experience building containerized applications
Strong experience in designing and deploying data pipelines in cloud environments
Google Cloud Professional Data Engineer certification (Preferred)