Salary: Not Disclosed
1 Vacancy
The client you will be working for is a market leader in the German banking industry. The purpose of the project is to migrate data from various source systems to Google Cloud Platform within an agile team of Data and Java Engineers (junior and senior), working closely with the client in two-week SCRUM sprints.
Responsibilities:
Drive Data Efficiency: Create and maintain optimal data transformation pipelines.
Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements.
Lead Innovation and Process Optimization: Identify, design, and implement process improvements, such as automating manual processes, optimizing data delivery, and redesigning infrastructure for higher scalability.
Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies.
Unlock Actionable Insights: Build and use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Collaborate with Cross-Functional Teams: Work with clients and internal stakeholders, including Senior Management, Department Heads, and the Product, Data, and Design teams, to assist with data-related technical issues and guide their data infrastructure needs.
Qualifications:
Must have:
8 years of experience in a similar role, preferably within Agile teams
Strong experience with object-oriented programming languages, especially Java or Scala
In-depth knowledge of the Java ecosystem, including frameworks such as Spring, Spring Boot, and Spring Data, as well as associated patterns (e.g., SOLID principles, Design Patterns) and API-driven development
Proficiency in SQL and experience working with relational databases such as MySQL or PostgreSQL
Experience with Bash shell scripting in a Linux environment
Familiarity with cloud service providers, especially Google Cloud Platform; Azure or AWS are also acceptable
Solid understanding of microservice architecture
Experience with test-driven development (TDD)
Willing to develop:
Understanding of functional programming (Scala)
Exposure to Terraform and CI/CD pipelines (Jenkins or similar)
Familiarity with Apache Spark
Interest in GCP services (Dataproc, BigQuery, Pub/Sub, Cloud Functions)
Working knowledge of highly scalable big data datastores
Additional Information:
Enjoy our holistic benefits program, which covers the four pillars that we believe come together to support our well-being: social, physical, and emotional well-being, as well as work-life fusion.
Remote Work:
Yes
Employment Type:
Full-time