Scala-Spark developer
Strong knowledge of distributed computing principles and big data technologies such as Hadoop, Spark Streaming, etc.
Experience with ETL processes and data modelling.
Problem-solving and troubleshooting skills.
Working knowledge of Oozie/Airflow.
Experience in writing unit test cases and shell scripting.
Ability to work independently and as part of a team in a fast-paced environment.
Use of version control (Git) and related software lifecycle tooling.
Experience with Spark Streaming/Kafka streaming.
Experience with software development methodologies such as Agile and Scrum.
Approaches for optimisation at the system level as well as at the algorithm implementation level.
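To illustrate the unit-testing expectation above, here is a minimal, hypothetical sketch: an ETL transformation factored into pure Scala functions (no Spark dependency) so it can be tested in isolation. The names `Order`, `parseLine`, and `totalAmount` are illustrative, not from any specific codebase.

```scala
// Hypothetical example: a pure transformation extracted from an ETL job
// so it can be unit-tested without a Spark cluster.
case class Order(id: String, amount: Double)

object OrderEtl {
  // Parse a raw CSV line like "o1,19.99"; malformed lines yield None.
  def parseLine(line: String): Option[Order] =
    line.split(",", -1) match {
      case Array(id, amt) if id.nonEmpty =>
        amt.toDoubleOption.map(Order(id, _))
      case _ => None
    }

  // Sum the amounts across all well-formed lines, skipping bad records.
  def totalAmount(lines: Seq[String]): Double =
    lines.flatMap(parseLine).map(_.amount).sum
}
```

Keeping parsing and aggregation free of `SparkSession` is a common design choice: the same functions can be exercised in plain assertions or a ScalaTest suite, then applied inside a Spark job via `map`/`flatMap`.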