About the role: The Data Modeler designs fit-for-purpose conceptual, logical and physical data models in line with business requirements to serve analytical, business intelligence and operational use cases, primarily within the data platform.
Key accountabilities & responsibilities:
- Elicit, analyze and document data requirements in support of analytical, business intelligence, warehousing and other business use cases for the data platform, using a range of techniques including source data analysis, documentation, interviews and data modelling workshops.
- Create and maintain data models that are appropriate to the need and conform to data modelling standards.
- Produce and maintain metadata (including relationships, calculation logic, etc.) and documentation to accompany data models, using data modelling tools where appropriate.
- Ensure models provide data structures that meet the required range of non-functional requirements, including performance, extensibility, change capture (SCD, etc.), understandability and maintainability.
- Create and maintain specifications for data transformation as both documentation (Source-to-Target Mapping) and scripts.
- Agree artefacts, models, documentation and scripts with relevant business owners, stewards, SMEs and technical stakeholders.
- Test models and transformation scripts to ensure they meet requirements.
- Perform day-to-day data model and script maintenance to tune performance, respond to changes and support the resolution of IT issues.
- Advise data engineers, visualization developers and other consumers of models and data specifications on the interpretation of data models and structures and the understanding of data requirements.
- Contribute to the definition of data dictionaries and business glossaries.
- Partner with and support adjacent teams.
Knowledge/Experience:
- Proven knowledge of physical and logical data modelling in a data warehouse environment, including the successful creation of conformed dimensional models from a range of legacy source systems alongside modern SaaS/cloud business applications.
- Experience in a similar role within insurance (ideally health insurance) or a similarly complex and regulated industry, with a demonstrable, sound working knowledge of how the business operates.
- Experienced at capturing technical and business metadata, including eliciting and creating sound definitions for entities and attributes.
- Practiced at querying source or raw data and reverse-engineering the underlying data model and data definitions.
- Experienced in writing data transformation scripts using SQL (DDL and DML) and PySpark.
- Good knowledge of and exposure to software development lifecycles and good engineering practices.
- Can demonstrate a good working knowledge of data modelling patterns and when to use them.
Technical skills:
- Entity-relationship, dimensional and NoSQL modelling, as appropriate to data warehousing, business intelligence and analytical approaches, using IE or other common notations.
- SQL (DDL and DML) and PySpark scripting.
- ERwin and Visio data modelling/UML tools.
- Ideally Azure Data Factory, Azure DevOps and Databricks.
Required Experience:
Contract