For one of our ongoing projects, we are looking for an ETL & Big Data developer.
This role requires a wide variety of strengths and capabilities including:
- 5 years of experience with ETL development tools: Spark/Java (preferred), Python, Ab Initio, Informatica.
- Advanced experience & demonstrated proficiency in Core Java and Spark.
- Experience with Apache Beam, Cloud Dataflow, Cloud Composer, and BigQuery.
- 5 years of experience in Big Data technologies, distributed multi-tier application development, database design, data processing, and data warehousing.
- Strong ability to write, interpret, tune, and debug complex SQL queries and stored procedures.
- Strong data profiling, data analysis, and data validation skills.
- Deep DBMS experience on multiple platforms (incl. Oracle, Teradata, and SQL Server).
- Advanced understanding of data warehousing and ETL concepts (esp. change data capture).
- Strong experience with database design and architecture best practices, including normalization and dimensional modeling.
- Working experience as an Agile developer and a good understanding of SDLC methodologies/guidelines.
- Experience developing complex UNIX shell scripts.
- Experience with Git or similar source code versioning tools, and with coding standards.
- Experience with a scheduling tool such as AutoSys or similar.
- Experience documenting business requirements, functional specifications, and test plans.
- Ability to collaborate with high-performing teams and individuals throughout the firm to accomplish common goals.
- Eagerness to learn new technologies.
- Experience in the banking industry.