Title: Data Network Engineer
Location: Atlanta, Georgia (Hybrid; local candidates only)
Duration: Long Term Contract
Type: W2 & C2C
Experience: 5 years
Interview: Web cam or in person.
Job Description:
- The Client is seeking a highly skilled and proactive Data Engineer to join our dynamic team and support the modernization of our data estate.
- This role is integral to the migration from legacy systems and the development of scalable, secure, and efficient data solutions using modern technologies, particularly Microsoft Fabric and Azure-based platforms.
- The successful candidate will contribute to data infrastructure design, data modeling, pipeline development, and visualization delivery to enable data-driven decision-making across the enterprise.
Work Location & Attendance Requirements:
- Must be physically located in metro Atlanta.
- On-site: Tuesday to Thursday, per the manager's discretion
- Mandatory in-person meetings:
  - All Hands
  - Enterprise Applications
  - On-site meetings
  - DECAL All Staff
- Work arrangements subject to management's discretion
Key Responsibilities:
- Design, build, and maintain scalable ETL/ELT data pipelines using Microsoft Fabric and Azure Databricks.
- Implement medallion architecture (Bronze, Silver, Gold) to support the data lifecycle and data quality.
- Support the sunsetting of legacy SQL-based infrastructure and SSRS, ensuring data continuity and stakeholder readiness.
- Create and manage notebooks (e.g., Fabric Notebooks, Databricks) for data transformation using Python, SQL, and Spark.
- Build and deliver curated datasets and analytics models to support Power BI dashboards and reports.
- Develop dimensional and real-time data models for analytics use cases.
- Collaborate with data analysts, data stewards, and business stakeholders to deliver fit-for-purpose data assets.
- Apply data governance policies, including row-level security, data masking, and classification, in line with Microsoft Purview or Unity Catalog.
- Ensure monitoring, logging, and CI/CD automation using Azure DevOps for data workflows.
- Provide support during data migration and cutover events, ensuring minimal disruption.
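To illustrate the medallion pattern the responsibilities above describe, here is a minimal Python sketch of a Bronze/Silver/Gold flow. It is a hypothetical, in-memory example only: real pipelines on this stack would use Spark DataFrames in Fabric or Databricks notebooks, and the record fields (`id`, `region`, `amount`) are invented for illustration.

```python
# Hypothetical medallion (Bronze/Silver/Gold) sketch using plain Python lists
# in place of lakehouse tables. Not a Fabric/Databricks implementation.

# Bronze: raw records ingested as-is, including duplicates and bad rows.
bronze = [
    {"id": 1, "region": "GA", "amount": "100"},
    {"id": 2, "region": "ga", "amount": "250"},
    {"id": 2, "region": "ga", "amount": "250"},   # duplicate
    {"id": 3, "region": None, "amount": "oops"},  # unparseable amount
]

def to_silver(rows):
    """Silver: deduplicate by id, normalize values, drop invalid rows."""
    seen, silver = set(), []
    for r in rows:
        if r["id"] in seen:
            continue  # skip duplicate keys
        try:
            amount = float(r["amount"])
        except (TypeError, ValueError):
            continue  # skip records that fail validation
        seen.add(r["id"])
        silver.append({
            "id": r["id"],
            "region": (r["region"] or "UNKNOWN").upper(),
            "amount": amount,
        })
    return silver

def to_gold(rows):
    """Gold: business-level aggregate (total amount per region)."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'GA': 350.0}
```

Each layer only ever reads from the layer below it, which is the property that makes the Bronze tier a replayable source of truth during migration and cutover events.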
Technical Stack:
- Microsoft Fabric
- Azure Databricks
- SQL Server / SQL Managed Instances
- Power BI (including semantic models and datasets)
- SSRS (for legacy support and decommissioning)
Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 5 years of experience in data engineering roles, preferably in government or regulated environments
- Proficiency in SQL, Python, and Spark
- Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake)
- Experience with Power BI data modeling and dashboard development
- Familiarity with data governance tools (Microsoft Purview, Unity Catalog)
- Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design
- Strong communication and collaboration skills.
Preferred Qualifications:
- Certifications such as Microsoft Certified: Fabric Analytics Engineer or Azure Data Engineer Associate
- Knowledge of CI/CD automation with Azure DevOps
- Familiarity with data security and compliance standards (e.g., FIPS 199, NIST)
- Experience managing the sunset and modernization of legacy reporting systems such as SSRS
Soft Skills:
- Strong analytical thinking and problem-solving abilities
- Ability to collaborate across multidisciplinary teams
- Comfort in fast-paced and evolving technology environments
This role is critical to our shift toward a modern data platform and offers the opportunity to influence our architectural decisions and technical roadmap.
Required/Desired Skills:

Skill | Required/Desired | Years of Experience
Experience in data engineering roles, preferably in government or regulated environments | Required | 5
Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake) | Required | 5
Experience with Power BI data modeling and dashboard development | Required | 5
Familiarity with data governance tools (Microsoft Purview, Unity Catalog) | Required | 5
Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design | Required | 5
Bachelor's degree in Computer Science, Information Systems, or a related field | Required |