Job Title: Data Modeler/Data Architect
Location: Dallas, TX (Onsite)
Job Type: Contract
Responsibilities:
- Design high-quality domain (canonical) data models and consumption-layer dimensional models, ensuring compliance with Client data architecture and governance standards.
- Collaborate with business and technical stakeholders to define, document, and govern how data flows across the organization, supporting analytics, reporting, operational use cases, and advanced data science initiatives.
- Work under the guidance of Client Core Data Architects to create data models for the different layers (Domain and Consumption) of our enterprise data platform, hosted in Databricks on Azure.
- Collaborate with Application Data SMEs to understand the complete structure and business definition of the source data.
- Collaborate with business analysts and business analytics SMEs to understand the BUS Matrix and the associated requirements definition for the data needs.
- Create and maintain data models using the ER/Studio tool, complete with DDL generation.
- Define end-to-end data lineage (S2T) and document it on Confluence pages per the given Client standards.
- Generate SQL query snippets to explain data transformation logic (a sketch of this kind of snippet follows the deliverables list below).
- Profile and analyze source data to ensure data quality, and recommend data refinement and cleansing methods.
- Perform necessary reviews and obtain sign-offs prior to delivery to the DEV and QA teams.
- Perform handoff walkthroughs of the model and S2T with the DEV team, and participate in design review sessions for any refinements as necessary.
- Perform handoff walkthroughs of the model and S2T with the QA team, and participate in defect triage sessions as necessary.
- Build SQL queries for end-consumption business views in alignment with the business requirements.
- Work with business UAT teams to clarify data model and business view questions.
- Provide any other necessary support through the engineering build, QA, and UAT phases.
Deliverables:
- Domain Layer:
  - Data Model & DDL: Logical and physical models in ER/Studio, including DDL.
  - S2T: Data lineage documentation, transformation logic, DQ refinement and cleansing logic, and query snippets.
- Consumption Layer:
  - Data Model & DDL: Logical and physical models in ER/Studio, including DDL.
  - S2T: Data lineage documentation, transformation logic, DQ refinement and cleansing logic, and query snippets.
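For illustration only, here is a minimal sketch of the two deliverable types above: a consumption-layer DDL fragment of the kind ER/Studio would generate, and an S2T query snippet documenting transformation logic. Every table, column, and code value below is hypothetical and stands in for actual Client model content.

```sql
-- Hypothetical consumption-layer dimension (illustrative names only).
CREATE TABLE dim_policy (
    policy_key      BIGINT      NOT NULL,  -- surrogate key
    policy_number   VARCHAR(20) NOT NULL,  -- natural key from source
    product_code    VARCHAR(10),
    issue_date      DATE,
    status_code     VARCHAR(5),
    effective_from  DATE        NOT NULL,  -- SCD Type 2 validity window
    effective_to    DATE,
    is_current      BOOLEAN     NOT NULL,
    CONSTRAINT pk_dim_policy PRIMARY KEY (policy_key)
);

-- Hypothetical S2T query snippet: documents how raw source codes are
-- trimmed and standardized before loading the dimension.
SELECT
    src.pol_no               AS policy_number,
    TRIM(src.prod_cd)        AS product_code,
    CASE UPPER(TRIM(src.pol_status))
        WHEN 'A' THEN 'ACT'  -- active
        WHEN 'L' THEN 'LPS'  -- lapsed
        ELSE 'UNK'           -- unknown; flag for DQ review
    END                      AS status_code
FROM staging.policy_master AS src;
```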
Qualifications:
- BA or BS degree with a STEM major is preferred; a relevant advanced degree is a plus.
- 5-10 years in data architecture, data modeling, or enterprise analytics.
- Strong experience with canonical/domain modeling, dimensional modeling, and semantic modeling.
- Proficiency in documenting data lineage and metadata (experience with Purview, Collibra, Alation, or native Databricks capabilities is helpful).
- Experience with data modeling tools (ER/Studio preferred; Erwin or similar is also helpful).
- Current with industry standards (DAMA-DMBOK, Data Vault, Kimball/star schema, Data Mesh principles).
- Experience with life insurance, annuities, policy administration, claims, actuarial, or related financial services data is highly desirable.
- Understanding of ACORD data standards, product hierarchies, distribution channels, customer/party models, and regulatory reporting is an added advantage.
- Prior conceptual understanding of industry data models such as IBM IIW, Teradata ILDM, or Oracle OIDF is helpful.
- Proficiency in writing SQL queries is a must (a brief Databricks SQL sketch follows this list).
- Experience with Azure Databricks (Delta Lake, Unity Catalog, DBFS, SQL endpoints) is an added advantage.
- Familiarity with Data Mesh, Kimball, Inmon, Data Vault, and Lakehouse modeling patterns.
- Excellent communication skills and the ability to translate complex concepts for non-technical audiences.
- Ability to lead architecture discussions and influence stakeholders.
- Comfortable working in agile delivery environments.
- Strong documentation habits and detail orientation.
- Experience with data contracts, API modeling, or event-driven architecture is helpful.
- Experience with Alteryx and/or Tableau is an added advantage.
- Experience with SSIS is helpful.
- Knowledge of Python is helpful.
- Additional requirements are TBD and may vary from project to project.
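To ground the SQL and Azure Databricks items above, the sketch below shows a Delta table and a downstream business view addressed through Unity Catalog's three-level catalog.schema.table namespace. All catalog, schema, and object names are assumptions made for the example, not part of the posting.

```sql
-- Hypothetical Delta table in a Unity Catalog namespace
-- (catalog "insurance" and schema "consumption" are illustrative).
CREATE TABLE IF NOT EXISTS insurance.consumption.fact_claim (
    claim_key    BIGINT,
    policy_key   BIGINT,
    claim_amount DECIMAL(18, 2),
    claim_date   DATE
) USING DELTA;

-- Hypothetical end-consumption business view of the kind described
-- in the responsibilities above.
CREATE OR REPLACE VIEW insurance.consumption.vw_claims_by_policy AS
SELECT policy_key,
       COUNT(*)          AS claim_count,
       SUM(claim_amount) AS total_claim_amount
FROM insurance.consumption.fact_claim
GROUP BY policy_key;
```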
Skills:
- Data Modeler
- DDL
- Databricks, Data Lake
- S2T (Source-to-target)
- ER/Studio, Erwin
- Data Vault, Kimball/Star Schema
- Teradata, IBM
- SQL