Responsible for designing, building, and maintaining data pipelines that support data integrations for the Enterprise Data Warehouse, Operational Data Store, Data Marts, etc., following FCB-defined guidelines.
Technical/Business Skills:
- Data Engineering:
- SAS ETL skills for maintaining and making minor enhancements to current SAS scripts, but should be ready to also use the other ETL tools listed below for the future state.
- Experience in designing and building Data Warehouses and Data Lakes, with good knowledge of data warehouse principles and concepts.
- Technical expertise working in large-scale Data Warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server.
- Experience with public cloud-based data platforms, especially Snowflake and AWS.
- Data integration skills:
- Expertise in the design and development of complex data pipelines and solutions using industry-leading ETL tools such as SAP Business Objects Data Services (BODS), Informatica Intelligent Cloud Services (IICS), and IBM DataStage.
- Experience with ELT tools such as dbt, Fivetran, and AWS Glue.
- Expert in SQL, with development experience in at least one scripting language (Python, etc.), and adept at tracing and resolving data integrity issues (see the first sketch below).
- Strong knowledge of data architecture, data design patterns, modeling, and cloud data solutions (Snowflake, AWS Redshift, Google BigQuery).
- Data Modeling: Expertise in logical and physical data models using relational or dimensional modeling practices, and in high-volume ETL/ELT processes (see the second sketch below).
- Performance tuning of data pipelines and DB objects to deliver optimal performance (see the third sketch below).
- Experience with GitLab version control and CI/CD.
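
To make the SQL-plus-scripting expectation concrete, here is a minimal sketch of tracing a data integrity issue: reconciling row counts and keys between a staging table and its warehouse target. This is only an illustration; Python's bundled sqlite3 stands in for the actual warehouse, and the table and column names (stg_orders, dw_orders, order_id) are hypothetical.

```python
# First sketch: a minimal source-to-target reconciliation check.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5);
""")

def reconcile(conn, source, target, key):
    """Compare row counts and list keys present in source but missing in target."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    missing = conn.execute(
        f"SELECT {key} FROM {source} EXCEPT SELECT {key} FROM {target}"
    ).fetchall()
    return src, tgt, [row[0] for row in missing]

print(reconcile(conn, "stg_orders", "dw_orders", "order_id"))
# -> (3, 2, [3]): one order loaded into staging never reached the warehouse
```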
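As a small illustration of the dimensional-modeling practice listed above, the sketch below defines a minimal physical star schema: one additive fact table joined to two dimensions via surrogate keys. The design and names (fact_sales, dim_customer, dim_date) are assumptions for illustration, not a prescribed model.

```python
# Second sketch: a minimal star schema (one fact table, two dimensions).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,  -- surrogate key
        customer_id  TEXT NOT NULL,        -- natural/business key
        segment      TEXT
    );
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,     -- e.g. 20240131
        full_date TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
        amount       REAL NOT NULL         -- additive measure
    );
""")
```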
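Finally, a sketch of the kind of DB-object tuning mentioned above: inspecting a query plan before and after adding an index. sqlite3 is again only a stand-in; the same EXPLAIN-driven workflow applies to Oracle, Teradata, SQL Server, or Snowflake, each with its own plan output.

```python
# Third sketch: confirm that an index turns a full table scan into an index search.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id INTEGER PRIMARY KEY, account_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events (account_id, payload) VALUES (?, ?)",
    [(i % 1000, "x") for i in range(100_000)],
)

query = "SELECT COUNT(*) FROM events WHERE account_id = ?"

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); keep the detail text.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql, (42,))]

print("before:", plan(query))  # e.g. ['SCAN events'] -- full table scan
conn.execute("CREATE INDEX ix_events_account ON events (account_id)")
print("after: ", plan(query))  # e.g. ['SEARCH events USING COVERING INDEX ix_events_account (account_id=?)']
```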