Architect (Azure Data Warehouse Developer)

Job Location

Harrisburg, PA - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

LOCATION:

The contractor must reside in PA and will be permitted to work from home.

The contractor is expected to be in the office at least one day per month, subject to additional in-office days at the manager's discretion.

Architect (Azure Data Warehouse Developer)

This position supports a Data Modernization Initiative with the vision that all public health policies and interventions are driven by data, and the mission to provide all internal and external public health decision makers with accessible, timely, reliable, and meaningful data to drive policies and interventions. The Enterprise Data Warehouse (EDW) is responding to DOH's need for centralized data and state-of-the-art data analysis services by modernizing its data portfolio, architecture, and statistical analysis capabilities, aimed at improving public health surveillance, interventions, future outbreak prevention outcomes, and research.

The Architect / Azure DW Developer position will support both the existing business and reporting requirements of individual DOH / DDAP systems and program areas, and the construction of a modern data warehouse that will serve DOH / DDAP from an enterprise perspective.

The primary objective of this engagement is for the selected candidate to serve as the data warehouse developer supporting the analysis and reporting needs of the DOH / DDAP and the design and construction of a modern EDW in Azure.

This position's scope includes:

  • modernization of DOH operations;
  • planning, coordinating, and responding to data reporting needs; setting standards and defining the framework;
  • assisting with large-volume data processing and statistical analysis of large datasets;
  • revamping the EDW in Microsoft's Azure Cloud utilizing Azure Databricks, Delta Lake, and Synapse, including compute, storage, and application fabric, as well as services for infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and serverless technologies;
  • creating a centralized data model;
  • supporting DOH projects such as ELC Enhanced Detection Expansion, the Data Modernization Initiative, PA NEDSS NextGen, PA LIMS Replacement, Reporting Hub, Verato UMPI, the COVID-19 response, and onboarding additional DOH systems into the EDW.

REQUIREMENTS

The Architect is a senior-level resource with advanced, specialized knowledge and experience in data warehousing, database, and programming concepts and technology. The selected contractor must have proven experience in the development, maintenance, and testing of Azure production systems and projects. This position designs, develops, tests, and implements data lakes, databases, extract-load-transform programs, applications, and reports. This position will work with business analysts, application developers, DBAs, and network and system staff to achieve project objectives: delivery dates, cost objectives, quality objectives, and program-area customer satisfaction objectives.

  • Manage assignments and track progress against agreed-upon timelines.
  • Plan, organize, prioritize, and manage work efforts, coordinating with the EDW and other teams.
  • Participate in status reviews, process reviews, deliverable reviews, and software quality assurance work product reviews with the appropriate stakeholders.
  • Participate in business and technical requirements gathering.
  • Perform research on potential solutions and provide recommendations to the EDW and DOH.
  • Develop and implement solutions that meet business and technical requirements.
  • Participate in testing of implemented solution(s).
  • Build and maintain relationships with key stakeholders and customer representatives.
  • Give presentations to the EDW and other DOH offices and agencies involved with this project.
  • Develop and maintain processes and procedural documentation.
  • Ensure project compliance with relevant federal and commonwealth standards and procedures.
  • Conduct training and transfer-of-knowledge sessions for system and code maintenance.
  • Complete weekly timesheet reporting in PeopleFluent/VectorVMS by COB each Friday.
  • Complete weekly project status updates in Daptiv if necessary; this depends on the project being entered in Daptiv.
  • Provide weekly personal status reports by COB Friday, submitted on SharePoint.
  • Utilize a SharePoint site for project and operational documentation; review existing documentation.

The Architect can design, develop, and implement data and ELT application infrastructure in Azure to provide reliable and scalable applications and systems that meet the organization's objectives and requirements. The Architect is familiar with a variety of application and database technologies, environments, concepts, methodologies, practices, and procedures.

The candidate must have significant hands-on technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory pipelines, Apache Spark, and Python.

Significant hands-on technical experience and expertise with the design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in using SQL Server and Azure Synapse.

Experience producing ETL/ELT using SQL Server Integration Services and other tools.
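The extract-load-transform work described above can be sketched in plain Python, using sqlite3 as a stand-in for SQL Server (the `src_cases`/`stg_cases` tables and their columns are hypothetical, purely for illustration):

```python
import sqlite3

# Stand-in for a SQL Server source and warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_cases (county TEXT, reported TEXT, count TEXT)")
conn.executemany(
    "INSERT INTO src_cases VALUES (?, ?, ?)",
    [("Dauphin", "2025-09-01", "12"),
     ("Dauphin", "2025-09-01", "12"),   # duplicate source record
     ("Lancaster", "2025-09-02", "7")],
)

# Extract: pull the raw rows from the source system.
rows = conn.execute("SELECT county, reported, count FROM src_cases").fetchall()

# Transform: de-duplicate and cast the text count column to an integer.
clean = sorted({(county, reported, int(count)) for county, reported, count in rows})

# Load: write the cleaned rows into a typed staging table.
conn.execute("CREATE TABLE stg_cases (county TEXT, reported TEXT, count INTEGER)")
conn.executemany("INSERT INTO stg_cases VALUES (?, ?, ?)", clean)

print(conn.execute("SELECT COUNT(*) FROM stg_cases").fetchone()[0])  # → 2
```

In a real pipeline the same extract/transform/load steps would be expressed as SSIS data flows or Azure Data Factory activities; the staging-table pattern carries over directly.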

Experience with SQL Server T-SQL scripts and queries.

Experience as an Azure DevOps CI/CD pipeline release manager who can design, implement, and maintain robust and scalable CI/CD pipelines; automate the build, test, and deployment processes for various applications and services; and troubleshoot and resolve pipeline issues and bottlenecks; with experience in monorepo-based CI/CD pipelines.

Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering techniques.
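The quality-control and cleansing steps named above amount to normalizing, filtering, and de-duplicating records. A minimal sketch (the `patient_id`/`county` field names are hypothetical):

```python
# Minimal data-cleansing sketch: trim whitespace, drop records missing
# required fields, and de-duplicate on the required fields.
records = [
    {"patient_id": " 1001 ", "county": "Erie"},
    {"patient_id": "1001", "county": "Erie"},   # duplicate after trimming
    {"patient_id": "", "county": "York"},       # fails quality control
]

def cleanse(rows, required=("patient_id", "county")):
    seen, out = set(), []
    for row in rows:
        row = {k: v.strip() for k, v in row.items()}   # normalize formatting
        if not all(row.get(k) for k in required):      # quality control
            continue
        key = tuple(row[k] for k in required)
        if key not in seen:                            # de-duplicate
            seen.add(key)
            out.append(row)
    return out

print(cleanse(records))  # → [{'patient_id': '1001', 'county': 'Erie'}]
```

At warehouse scale these same rules would run as set-based T-SQL or Spark transformations rather than a Python loop, but the normalize-filter-deduplicate order is the same.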

Experience with data mining architecture, modeling standards, reporting, and data analysis methodologies.

Experience with data engineering, database file systems, optimization, APIs, and analytics as a service.

Experience analyzing and translating business requirements and use cases into optimized designs, and developing sound solutions.

Advanced knowledge of relational databases, dimensional databases, entity relationships, data warehousing, facts, dimensions, and star schema concepts and terminology.

Creates and maintains technical documentation, diagrams, flowcharts, instructions, manuals, test plans, and test cases. Follows established SDLC best practices, documents code, and participates in peer code reviews.

Ability to balance work across multiple projects, with good organizational skills and minimal or no direct supervision.

Demonstrated ability to communicate and document clearly and concisely.

Ability to work collaboratively and effectively with colleagues as a member of a team.

Ability to present complex technical concepts and data to a varied audience effectively.

More than 5 years of relevant experience.

4-year college degree in computer science or a related field, with advanced study preferred.



PREFERRED EXPERIENCE

Experience working in the public health or healthcare industry with various health data sets.

TIMEFRAMES

This will be a 10-month engagement beginning in September 2025.

In addition, DOH will supply all hardware and software needed for daily use to complete assigned work items.

Required/Desired Skills

| Skill | Required/Desired | Experience Required (Years) |
| --- | --- | --- |
| Technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory pipelines, Apache Spark, and Python | Required | 5 |
| Design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in using SQL Server / Azure Synapse | Required | 5 |
| Experience producing ETL/ELT using SQL Server Integration Services and other tools | Required | 5 |
| Experience with SQL Server T-SQL scripts and queries | Required | 5 |
| Experience as an Azure DevOps CI/CD pipeline release manager who can design, implement, and maintain robust and scalable CI/CD pipelines | Required | 5 |
| Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering | Required | 5 |
| Experience with data engineering, database file systems, optimization, APIs, and analytics as a service | Required | 5 |
| Experience with data mining architecture, modeling standards, reporting, and data analysis methodologies | Required | 5 |
| 4-year college degree in computer science or a related field, with advanced study preferred | Required | 0 |

Questions

  • Question 2: The candidate must reside in PA. Where does your candidate currently reside?
  • Question 3: Does the candidate possess more than 5 years of experience as an Azure Data Warehouse Developer?

Employment Type

Full-time
