Data Warehouse Engineer

Job Location: Tinton Falls, NJ - USA
Monthly Salary: Not Disclosed
Vacancies: 1 Vacancy

Job Summary

The DW Data Engineer will play a critical role in building, enhancing, and optimizing the enterprise analytics platform. This individual will design and develop ETL/ELT pipelines, Lakehouse/Warehouse models, and curated datasets that power reporting and analytics. The position works closely with BI Analysts, BI Developers, Architects, and business stakeholders to ensure that high-quality, scalable, and governed data is available for decision-making.

Role and Responsibilities

Data Engineering & Pipeline Development

Design, build, and maintain ETL/ELT pipelines using Microsoft Fabric (Pipelines, Dataflows Gen2, Notebooks, Spark) and legacy SSIS.

Develop ingestion frameworks for flat files (CSV/Excel), APIs, SaaS platforms, cloud feeds, and partner data.

Implement the medallion architecture (Bronze, Silver, Gold) using Lakehouse (Delta Lake), Warehouse, and OneLake.

Automate data transformations using SQL, PySpark, and Fabric Notebooks.
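
As a purely illustrative sketch of the Bronze-to-Silver transformations described above (table, column, and layer names are hypothetical, not part of this posting), a Fabric notebook step might look like:

```python
# Minimal sketch, assuming a Spark environment with Delta Lake
# (e.g., a Fabric notebook). All table/column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw orders from the Bronze layer.
bronze = spark.read.table("bronze.orders_raw")

# Standardize types, deduplicate on the business key, add an audit column.
silver = (
    bronze
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
    .withColumn("_loaded_at", F.current_timestamp())
)

# Persist to the Silver layer as a Delta table.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```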

Data Modeling & Optimization

Build and optimize star-schema models, conformed dimensions, and fact tables for BI consumption.

Implement incremental loads, SCD handling (Type 1/2), partitioning, Z-ordering, compaction, and other Delta Lake optimization techniques (a simplified SCD sketch follows this list).

Collaborate with BI Analysts to translate business requirements into performant data models.
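
For the SCD Type 2 handling mentioned above, a minimal sketch using the Delta Lake MERGE API could look like the following (all table and column names are hypothetical, and the incoming snapshot is assumed to align with the dimension's schema):

```python
# Simplified SCD Type 2 load against a hypothetical dim_customer Delta table.
# customer_id is the business key; segment is the one tracked attribute here.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
updates = spark.read.table("silver.customers")        # incoming snapshot
dim = DeltaTable.forName(spark, "gold.dim_customer")  # target dimension

# Step 1: close out current rows whose tracked attribute changed.
(dim.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.segment <> s.segment",
        set={"is_current": "false",
             "valid_to": "current_timestamp()"})
    .execute())

# Step 2: append new versions for changed keys and brand-new keys
# (any key without a current row after step 1).
current = spark.read.table("gold.dim_customer").where("is_current = true")
new_rows = (updates.join(current, "customer_id", "left_anti")
            .withColumn("is_current", F.lit(True))
            .withColumn("valid_from", F.current_timestamp())
            .withColumn("valid_to", F.lit(None).cast("timestamp")))
new_rows.write.format("delta").mode("append").saveAsTable("gold.dim_customer")
```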

Data Quality, Governance & Security

Ensure end-to-end data quality through validation, reconciliations, profiling, and automated tests (a sketch follows this list).

Apply governance principles using Purview for lineage, classification, and data cataloging.

Enforce Row-Level Security (RLS), object-level security, and access controls across Fabric datasets.
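
For the automated data-quality tests mentioned above, a minimal sketch (hypothetical tables, keys, and thresholds) might be:

```python
# Minimal sketch of automated validation and reconciliation checks between
# Bronze and Silver. Tables, columns, and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
silver = spark.read.table("silver.orders")
bronze = spark.read.table("bronze.orders_raw")

checks = {
    # The business key must be non-null and unique in Silver.
    "null_order_ids": silver.where(F.col("order_id").isNull()).count(),
    "duplicate_order_ids":
        silver.count() - silver.select("order_id").distinct().count(),
    # Reconciliation: distinct Bronze keys should all land in Silver.
    "row_count_drift":
        bronze.select("order_id").distinct().count() - silver.count(),
}

failures = {name: n for name, n in checks.items() if n != 0}
if failures:
    # A real pipeline might raise, alert, or quarantine the load here.
    raise ValueError(f"Data quality checks failed: {failures}")
```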

Cross-Team Collaboration

Partner with BI Analysts and business stakeholders to understand KPIs, metrics, and reporting requirements.

Work with Architects to establish data platform standards, naming conventions, folder structures, and version-control patterns.

Provide technical expertise during UAT, troubleshooting, and performance tuning.

Operational Excellence

Monitor pipeline performance and proactively resolve failures.

Implement CI/CD practices using Azure DevOps / Git integration for code and artifact promotion across Dev, Stage, and Prod.

Contribute to documentation of data flows, data dictionaries, technical specifications, and workflows.

Qualifications and Education Requirements

Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.

5 years of experience in data engineering, BI development, or data warehouse development.

Strong SQL skills (T-SQL) for complex transforms, joins, window functions, and performance tuning.

Hands-on experience with Microsoft Fabric (Lakehouse, Warehouse, OneLake, Pipelines, Dataflows Gen2, Notebooks).

Experience with Delta Lake, Parquet, and medallion architectures.

Proficiency with Python or PySpark for ingestion and transformation.

Experience integrating REST APIs, SFTP feeds, SaaS connectors, and partner files.

Strong understanding of dimensional modeling (Kimball), conformed dimensions, and data mart design.

Familiarity with CI/CD workflows (Azure DevOps, Git).

Excellent troubleshooting, debugging, and performance-optimization abilities.

3-5 years of experience with SSMS / SSDT / SSIS / SSAS / SSRS.

Preferred Skills

Experience with Power BI, including semantic models and performance considerations.

Exposure to Azure Data Factory, Synapse, or Databricks.

Experience with workflow orchestration and metadata-driven frameworks.

Knowledge of data governance tools (Purview), data security best practices, and lineage management.
