AuxoAI drives large-scale data modernization and AI readiness for global enterprises. We are looking for an experienced Data Modeler to design, standardize, and maintain enterprise data models across our modernization initiatives, ensuring consistency, quality, and business alignment across cloud data platforms.
You will be responsible for translating business requirements and data flows into robust conceptual, logical, and physical data models across multiple domains (Customer, Product, Finance, Supply Chain, etc.). You will work closely with Data Architects, Engineers, and Governance teams to ensure data is structured, traceable, and optimized for analytics and interoperability across platforms such as Snowflake, Dremio, and Databricks.
Responsibilities
- Develop conceptual, logical, and physical data models aligned with enterprise architecture standards.
- Engage with Business Stakeholders: Collaborate with business teams, business analysts, and SMEs to understand business processes, data lifecycles, and the key metrics that drive value and outcomes.
- Value Chain Understanding: Analyze end-to-end customer and product value chains to identify the critical data entities, relationships, and dependencies that should be represented in the data model.
- Conceptual and Logical Modeling: Translate business concepts and data requirements into conceptual and logical data models that capture enterprise semantics and support analytical and operational needs.
- Physical Data Modeling: Design and implement physical data models optimized for performance and scalability.
- Semantic Layer Design: Create semantic models that give business users access to data via BI tools and data discovery platforms.
- Data Standards and Governance: Ensure models comply with enterprise data standards, naming conventions, lineage tracking, and governance practices.
- Implement naming conventions, data standards, and metadata definitions across all models (see the validation sketch after this list).
- Collaboration with Data Engineering: Work closely with data engineers to align data pipelines with the logical and physical models, ensuring consistency and accuracy from ingestion to consumption.
- Manage version control, lineage tracking, and change documentation for models.
- Participate in data quality and governance initiatives to ensure trusted and consistent data definitions across domains.
- Create and maintain a business glossary in collaboration with the governance team.
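To make the naming-conventions responsibility concrete, here is a minimal sketch of the kind of automated check a modeler might run against a model's table and column inventory. The rule set (snake_case names, required audit columns) and the sample tables are illustrative assumptions, not AuxoAI standards.

```python
import re

# Illustrative rule set; an enterprise standard would come from the
# governance team's published conventions, not from this script.
SNAKE_CASE = re.compile(r"^[a-z][a-z0-9_]*$")
REQUIRED_AUDIT_COLUMNS = {"created_at", "updated_at"}

def validate_table(table_name: str, columns: list[str]) -> list[str]:
    """Return a list of human-readable convention violations for one table."""
    violations = []
    if not SNAKE_CASE.match(table_name):
        violations.append(f"table '{table_name}' is not snake_case")
    for col in columns:
        if not SNAKE_CASE.match(col):
            violations.append(f"column '{table_name}.{col}' is not snake_case")
    missing = REQUIRED_AUDIT_COLUMNS - set(columns)
    if missing:
        violations.append(f"table '{table_name}' missing audit columns: {sorted(missing)}")
    return violations

# Hypothetical model excerpt used only to exercise the checks.
model = {
    "dim_customer": ["customer_id", "customer_name", "created_at", "updated_at"],
    "FactSales": ["sales_id", "CustomerID", "amount", "created_at"],
}

for table, cols in model.items():
    for violation in validate_table(table, cols):
        print(violation)
```

In practice a check like this would read its inventory from the modeling tool's export or the warehouse information schema rather than a hard-coded dictionary.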
Requirements
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 8 years of experience in data modeling or enterprise data architecture.
- Strong experience with modeling tools such as Erwin, ER/Studio, PowerDesigner, or SQLDBM.
- Deep understanding of dimensional modeling (Kimball/Inmon), normalization, and schema design for modern warehouses (a star-schema sketch follows this list).
- Experience designing models for Snowflake, Dremio, Redshift, or Databricks environments.
- Strong SQL and schema design skills, with the ability to validate model implementations.
- Familiarity with data governance, metadata management, and business glossary tools (e.g., Collibra, Alation, Purview).
- Excellent communication skills, with the ability to work with both technical and non-technical stakeholders.
- Experience in high-tech or manufacturing data domains (Customer, Product, Supply Chain).
- Familiarity with Medallion or Data Vault architectures and how they translate into dimensional models.
- Exposure to data integration pipelines and ETL frameworks (Informatica, dbt, Airflow).
- Understanding of master data and reference data management principles.
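As one illustration of the dimensional-modeling and SQL skills listed above, the sketch below declares a small star schema with SQLAlchemy Core and materializes it against an in-memory SQLite database as a cheap validation of the physical design. The table and column names are hypothetical; a real model for Snowflake or Databricks would add platform-specific options such as clustering keys.

```python
from sqlalchemy import (
    Column, Date, ForeignKey, Integer, MetaData, Numeric, String, Table,
    create_engine,
)

metadata = MetaData()

# Conformed customer dimension (hypothetical attributes).
dim_customer = Table(
    "dim_customer", metadata,
    Column("customer_key", Integer, primary_key=True),
    Column("customer_name", String(200), nullable=False),
    Column("region", String(50)),
)

# Date dimension keyed on a surrogate integer.
dim_date = Table(
    "dim_date", metadata,
    Column("date_key", Integer, primary_key=True),
    Column("calendar_date", Date, nullable=False),
)

# Fact table at order-line grain, referencing both dimensions.
fact_sales = Table(
    "fact_sales", metadata,
    Column("sales_key", Integer, primary_key=True),
    Column("customer_key", Integer, ForeignKey("dim_customer.customer_key"), nullable=False),
    Column("date_key", Integer, ForeignKey("dim_date.date_key"), nullable=False),
    Column("quantity", Integer, nullable=False),
    Column("net_amount", Numeric(18, 2), nullable=False),
)

# Creating the schema surfaces DDL errors before any platform rollout.
engine = create_engine("sqlite:///:memory:")
metadata.create_all(engine)
print(sorted(metadata.tables))
```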
Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- Proven experience as a DevOps Engineer or in a similar role, with hands-on expertise in AWS and Azure cloud environments.
- Strong proficiency in Azure DevOps, Git, GitHub, Jenkins, and CI/CD pipeline automation.
- Experience deploying and managing Kubernetes clusters (EKS, AKS) and container orchestration platforms (a minimal smoke-check sketch follows this list).
- Deep understanding of cloud-native architectures, microservices, and serverless computing.
- Familiarity with Azure Synapse, ADF, ADLS, and AWS data services (EMR, Redshift, Glue) for data integration and analytics.
- Solid grasp of infrastructure-as-code (IaC) tools such as Terraform, CloudFormation, or ARM templates.
- Experience with monitoring tools (e.g., Prometheus, Grafana) and logging solutions for cloud-based applications.
- Excellent troubleshooting skills and the ability to resolve complex technical issues in production environments.
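For the CI/CD and Kubernetes skills above, here is a minimal sketch of a post-deployment smoke check that a pipeline stage might run against an EKS cluster. The cluster name and region are placeholders; the script assumes only the standard boto3 EKS API and credentials available in the pipeline environment.

```python
import sys

import boto3

# Placeholder values; a real pipeline would inject these from its config.
REGION = "us-east-1"
CLUSTER_NAME = "example-cluster"

def cluster_is_active(name: str, region: str) -> bool:
    """Return True if the EKS cluster reports ACTIVE status."""
    eks = boto3.client("eks", region_name=region)
    status = eks.describe_cluster(name=name)["cluster"]["status"]
    print(f"cluster {name}: {status}")
    return status == "ACTIVE"

if __name__ == "__main__":
    # A non-zero exit code fails the pipeline stage that invoked the check.
    sys.exit(0 if cluster_is_active(CLUSTER_NAME, REGION) else 1)
```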