Salary Not Disclosed
1 Vacancy
Short Description:
Under general supervision, combines strong technical skills with knowledge of database administration. Works on one or more projects of high complexity.
Job Description:
The Dept. of Early Care & Development (DECAL) is seeking a highly skilled and proactive Data Engineer to join our dynamic team and support the modernization of our data estate. This role is integral to the migration from legacy systems and the development of scalable, secure, and efficient data solutions using modern technologies, particularly Microsoft Fabric and Azure-based platforms. The successful candidate will contribute to data infrastructure design, data modeling, pipeline development, and visualization delivery to enable data-driven decision-making across the enterprise.
Work Location & Attendance Requirements:
Must be physically located in metro Atlanta.
On-site: Tuesday to Thursday, per manager's discretion
Mandatory in-person meetings:
All Hands
Enterprise Applications on-site meetings
DECAL All Staff
Work arrangements are subject to management's discretion.
Key Responsibilities:
Design, build, and maintain scalable ETL/ELT data pipelines using Microsoft Fabric and Azure Databricks.
Implement medallion architecture (Bronze, Silver, Gold) to support the data lifecycle and data quality (see the pipeline sketch after this list).
Support the sunsetting of legacy SQL-based infrastructure and SSRS, ensuring data continuity and stakeholder readiness.
Create and manage notebooks (e.g., Fabric Notebooks, Databricks) for data transformation using Python, SQL, and Spark.
Build and deliver curated datasets and analytics models to support Power BI dashboards and reports.
Develop dimensional and real-time data models for analytics use cases.
Collaborate with data analysts, stewards, and business stakeholders to deliver fit-for-purpose data assets.
Apply data governance policies, including row-level security, data masking, and classification, in line with Microsoft Purview or Unity Catalog (see the row-level security sketch below).
Ensure monitoring, logging, and CI/CD automation using Azure DevOps for data workflows.
Provide support during data migration and cutover events, ensuring minimal disruption.
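To make the pipeline and notebook responsibilities above concrete, here is a minimal PySpark sketch of a Bronze-to-Silver-to-Gold medallion flow of the kind that might run in a Fabric or Databricks notebook. It is an illustration only: every table, column, and path name (bronze.enrollments, /lakehouse/raw/enrollments/, and so on) is a hypothetical placeholder, not something taken from the posting.

```python
# Minimal PySpark sketch of a Bronze -> Silver -> Gold medallion flow.
# All table, column, and path names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw source data as-is, preserving source fidelity.
bronze = (
    spark.read.format("csv")
    .option("header", "true")
    .load("/lakehouse/raw/enrollments/")  # hypothetical landing path
)
bronze.write.mode("append").format("delta").saveAsTable("bronze.enrollments")

# Silver: cleanse and conform (types, deduplication, basic quality rules).
silver = (
    spark.table("bronze.enrollments")
    .dropDuplicates(["enrollment_id"])
    .withColumn("enrollment_date", F.to_date("enrollment_date"))
    .filter(F.col("enrollment_id").isNotNull())
)
silver.write.mode("overwrite").format("delta").saveAsTable("silver.enrollments")

# Gold: curated, analytics-ready aggregate for Power BI semantic models.
gold = (
    spark.table("silver.enrollments")
    .groupBy("county", F.year("enrollment_date").alias("enrollment_year"))
    .agg(F.countDistinct("child_id").alias("enrolled_children"))
)
gold.write.mode("overwrite").format("delta").saveAsTable("gold.enrollment_summary")
```

In practice the Gold tables produced this way would feed the Power BI semantic models and dashboards mentioned above.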
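For the row-level security responsibility, one common pattern on the Databricks/Unity Catalog side is a row-filter function attached to a curated table, issued from a Python notebook via spark.sql(). The sketch below assumes that pattern; the function, mapping table, and group names are hypothetical, and the syntax shown is the Unity Catalog row-filter feature. A SQL Server or Fabric Warehouse equivalent would instead use security policies and predicate functions.

```python
# Hypothetical sketch: enforce row-level security on a curated (Gold) table
# with a Unity Catalog row filter. All object names (gold.enrollment_summary,
# gold.user_county_access, decal_admins) are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Filter function: admins see all rows; other users see only the counties
# they are mapped to in a user-to-county access table.
spark.sql("""
    CREATE OR REPLACE FUNCTION gold.county_filter(county_value STRING)
    RETURN is_account_group_member('decal_admins')
        OR EXISTS (
            SELECT 1
            FROM gold.user_county_access m
            WHERE m.user_name = current_user()
              AND m.county = county_value
        )
""")

# Attach the filter so it is applied automatically to queries against the table.
spark.sql("""
    ALTER TABLE gold.enrollment_summary
    SET ROW FILTER gold.county_filter ON (county)
""")
```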
Technical Stack:
Microsoft Fabric
Azure Databricks
SQL Server / SQL Managed Instances
Power BI (including semantic models and datasets)
SSRS (for legacy support and decommissioning)
Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field
5 years of experience in data engineering roles, preferably in government or regulated environments
Proficiency in SQL, Python, and Spark
Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake)
Experience with Power BI data modeling and dashboard development
Familiarity with data governance tools (Microsoft Purview, Unity Catalog)
Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design
Strong communication and collaboration skills.
Preferred Qualifications:
Certifications such as Microsoft Certified: Fabric Analytics Engineer or Azure Data Engineer Associate
Knowledge of CI/CD automation with Azure DevOps
Familiarity with data security and compliance standards (e.g., FIPS 199, NIST)
Experience managing the sunset and modernization of legacy reporting systems such as SSRS
Soft Skills:
Strong analytical thinking and problem-solving abilities
Ability to collaborate across multidisciplinary teams
Comfort in fast-paced and evolving technology environments
This role is critical to our shift toward a modern data platform and offers the opportunity to influence our architectural decisions and technical roadmap.
Skill Matrix:
Skill | Required / Desired | Amount of Experience
Experience in data engineering roles, preferably in government or regulated environments | Required | Years
Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake) | Required | Years
Experience with Power BI data modeling and dashboard development | Required | Years
Familiarity with data governance tools (Microsoft Purview, Unity Catalog) | Required | Years
Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design | Required | Years
Bachelor's degree in Computer Science, Information Systems, or a related field | Required |
Full-time