GCP Data Engineer | Java & PySpark
Job Summary
Synechron is seeking a talented and dedicated Data Engineer to join our Customer Personalization team. This role is critical in developing and maintaining advanced data systems and microservices that drive our global marketing and customer engagement initiatives. As a key member of the team, you will leverage your expertise in large-scale data engineering, Java, and cloud technologies on GCP to deliver innovative, scalable solutions that enhance personalization capabilities and business outcomes.
Software Requirements
Required:
- Java (preferably version 8 or newer), including experience with the Spring Boot framework
- SQL and experience working with relational databases and query optimization
- PySpark, Dataproc, and BigQuery on Google Cloud Platform (GCP)
- UNIX/Linux shell scripting, Python, and Perl (basic to intermediate knowledge)
- Development of RESTful web services
- Version control systems such as Git
Preferred:
- GCP-native tools such as Dataflow, Cloud Composer, and Cloud Storage
- Hadoop, Spark, and Hive for data processing and migration
- Machine learning libraries (TensorFlow, scikit-learn)
- Experience with Adobe Marketing Campaign tools
Overall Responsibilities
- Develop, test, and maintain microservices and APIs using Java (Spring Boot), deployed on GKE (Google Kubernetes Engine).
- Design, build, and optimize large-scale data pipelines using PySpark, Dataproc, and BigQuery.
- Migrate existing workloads from Hadoop, Spark, and Hive to GCP cloud infrastructure, ensuring scalability and efficiency.
- Collaborate within Agile teams to deliver high-quality software modules, adhering to best practices and documentation standards.
- Manage deployment, support, and performance monitoring of software across all stages, from testing to production.
- Engage with business stakeholders to analyze requirements, sketch solutions, and implement reliable data-driven features.
Technical Skills (By Category)
Programming Languages:
- Required: Java (Spring Boot, multithreading, collections)
- Required: SQL, Python, PySpark
- Preferred: Shell scripting, Perl
Databases/Data Management:
- BigQuery, Dataproc, Hadoop, Hive, Spark
Cloud Technologies:
- GCP (Dataproc, BigQuery, GKE, Cloud Storage)
Frameworks and Libraries:
- Spring Boot, REST API development, PySpark, Dataflow
Development Tools & Methodologies:
- Version control (Git), Agile development, CI/CD best practices
Security & Protocols:
- REST API security practices, data governance
Experience Requirements
- Minimum 6 years of relevant experience in software and data engineering roles.
- Demonstrated experience with designing and deploying scalable data pipelines and microservices.
- Extensive hands-on expertise in Java, Spring Boot, SQL, and cloud-based data tools on GCP.
- Scripting skills in UNIX/Linux environments using shell, Python, or Perl.
- Proven ability to work collaboratively within Agile teams, translating business needs into technical solutions.
Day-to-Day Activities
- Develop, test, and optimize microservices and data pipelines on GCP platforms.
- Engage in daily Agile ceremonies, sprint planning, and peer code reviews.
- Collaborate closely with product managers and business analysts to understand requirements and deliver solutions.
- Monitor application and data pipeline performance, troubleshoot issues, and implement improvements.
- Generate technical documentation and ensure compliance with best practices.
- Support deployment activities and provide ongoing support for live systems.
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related technical discipline (or equivalent work experience).
- Certifications in GCP (such as GCP Professional Data Engineer) are advantageous.
- Demonstrated commitment to continuous professional development and staying current with emerging technologies.
- Experience working in high-performance data-driven environments.
Professional Competencies
- Strong analytical and problem-solving skills.
- Effective communicator with the ability to work collaboratively across teams.
- Results-focused with an emphasis on delivering high-quality maintainable solutions.
- Adaptable to evolving project scopes and technological advancements.
- Innovative mindset, eager to explore new tools and approaches.
- Skilled in time management and prioritization to meet project deadlines.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Required Experience:
IC
About Company
At Synechron, we believe in the power of digital technology to transform businesses for the better. Our global consulting firm combines creativity with innovative technology to deliver leading digital solutions. Progressive technologies and strategies ...