Your mission will be to develop, test, and deploy technical and functional specifications from the Solution Designers/Business Architects/Business Analysts, guaranteeing correct operability and compliance with internal quality levels. We need somebody like you to help us on different fronts:
You will develop end-to-end ETL processes with Spark & Scala. This includes transferring data from/to the data lake, technical validations, business logic, etc.
You will use the Scrum methodology and be part of a high-performance team.
You will document your solutions in JIRA, Confluence, and ALM.
You will certify your delivery and its integration with other components, designing and performing the relevant tests to ensure the quality of your team's delivery.
Qualifications:
Required qualifications:
At least 2 years of experience working with Spark and Scala
Experience working with big data (Hadoop, Hive); knowledge of Azure Databricks is considered a plus.
Experience and expertise across data integration and data management with high data volumes.
Knowledge of good coding practices: clean code, software design patterns, a functional style of writing code, TDD.
Experience working in an Agile, continuous integration/DevOps paradigm and tool set (Git, GitHub, Jenkins, Sonar, Nexus, Jira).
Experience with different database structures, including PostgreSQL and Hive.
English (at least B2).
Preferred qualifications:
CI/CD: Jenkins, GitHub Actions
Orchestration: Control-M, Airflow
Scripting: Bash, Python
Software development life cycle (HP ALM)
Basics of cybersecurity & quality (Sonar, Fortify)
Basics of cloud computing (Docker, Kubernetes, OS3, Azure, AWS)
Additional Information:
Remote Work:
Yes
Employment Type:
Full-time