Note: This role is full-time only; visa sponsorship is not available.
Data Engineering:
Experience designing and building data warehouses and data lakes, with good knowledge of data warehouse principles and concepts.
Technical expertise working with large-scale data warehousing applications and databases such as Oracle, Netezza, and SQL Server.
Experience with public cloud-based data platforms, especially Snowflake and AWS.
Data integration skills:
Expertise in the design and development of complex data pipeline solutions using industry-leading ETL tools such as IBM DataStage, SAP BusinessObjects Data Services (BODS), or Informatica Intelligent Cloud Services (IICS).
Experience with ELT tools such as dbt, Fivetran, and AWS Glue for the future-state platform.
Expert in SQL, with development experience in at least one scripting language (e.g., Python); adept at tracing and resolving data integrity issues.
Strong knowledge of data architecture, data design patterns, modeling, and cloud data solutions (Snowflake, Amazon Redshift).
Data modeling: Expertise in logical and physical data modeling using relational or dimensional modeling practices, and in high-volume ETL/ELT processes.
Performance tuning of data pipelines and database objects to deliver optimal throughput.
Experience with GitLab version control and CI/CD processes.
Experience with the SDLC and Agile methodologies.
Experience working in the financial industry is a plus.