The client you will be working for is a market leader in the German banking industry. The purpose of the project is to migrate data from various source systems to Google Cloud Platform within an agile team of Data and Java Engineers (junior and senior), working closely with the client in two-week Scrum sprints.
Responsibilities:
- Drive Data Efficiency: Create and maintain optimal data transformation pipelines.
- Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements.
- Lead Innovation and Process Optimization: Identify, design, and implement process improvements, such as automating manual processes, optimizing data delivery, and re-designing infrastructure for higher scalability.
- Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading (ETL) of data from a wide variety of data sources using open-source technologies.
- Unlock Actionable Insights: Build and use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Collaborate with Cross-Functional Teams: Work with clients and internal stakeholders, including Senior Management, Department Heads, and the Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
Qualifications:
Must have:
- 8 years of experience in a similar role, preferably within Agile teams
- Strong experience with object-oriented programming languages, especially Java or Scala
- In-depth knowledge of the Java ecosystem, including frameworks such as Spring, Spring Boot, and Spring Data, as well as associated patterns (e.g., SOLID principles, design patterns) and API-driven development
- Proficiency in SQL and experience working with relational databases such as MySQL or PostgreSQL
- Experience with Bash shell scripting in a Linux environment
- Familiarity with cloud service providers, especially Google Cloud Platform; Azure or AWS are also acceptable
- Solid understanding of microservice architecture
- Experience with test-driven development (TDD)
Willing to develop:
- Understanding of functional programming (Scala)
- Exposure to Terraform and CI/CD pipelines (Jenkins or similar)
- Familiarity with Apache Spark
- Interest in GCP services (Dataproc, BigQuery, Pub/Sub, Cloud Functions)
- Working knowledge of highly scalable big data datastores
Additional Information:
Enjoy our holistic benefits program, built on the four pillars that we believe come together to support our wellbeing: physical, emotional, and social wellbeing, as well as work-life fusion.
- Physical Wellbeing: Our wellbeing program includes medical benefits, gym support, and personalised fitness options for an active lifestyle, complemented by team events and the Healthy Habits Club.
- Work-Life Fusion: In very dynamic industries such as IT, the line between our professional and personal lives can quickly become blurred. Having a one-size-fits-one approach gives us the flexibility to define the work-life dynamic that works for us.
- Emotional Wellbeing: We believe that to maintain our overall health, we need to invest in our mental wellbeing just as much as we do in our physical health, our social connections, or achieving work-life balance.
- Social Wellbeing: As a growing community in a hybrid environment, we want to ensure we remain connected, not just through the great work we do every day, but also through our passions and interests.
Remote Work:
Yes
Employment Type:
Full-time