Job Title: Azure Databricks Engineer
Location: Triangle region of North Carolina
Job Role: Plan and design ETL pipelines and product solutions using Azure Databricks.
Responsibilities:
Create resilient processes to ingest data from a variety of on-prem and cloud transactional databases and APIs.
Develop business requirements, facilitate change management documentation, and collaborate with stakeholders.
Work closely with the development technical lead and participate in design and planning discussions with the development team.
Research and engineer repeatable and resilient ETL workflows using Databricks notebooks and Delta Live Tables for both batch and stream processing.
Collaborate with business users to develop data products that align with business domain expectations.
Work with DBAs to ingest data from cloud and on-prem transactional databases.
Contribute to the development of the Data Architecture for NC DIT Transportation.
Required Skills:
Excellent interpersonal, written, and verbal communication skills.
Ability to write clean, easy-to-follow Databricks notebook code.
Deep knowledge of data engineering best practices, data warehouses, data lakes, and the Delta Lake architecture.
Good knowledge of Spark and Databricks SQL/PySpark.
Technical experience with Azure Databricks and cloud providers such as AWS, Google Cloud, or Azure.
In-depth knowledge of OLTP and OLAP systems, Apache Spark, and streaming products such as Azure Service Bus.
Practical experience with Databricks Delta Live Tables.
Desired Skills: Knowledge of object-oriented languages such as C#, Java, or Python.
Work Arrangement: Hybrid model with occasional need to be onsite at customer offices.
Notice: Absences greater than two weeks must be approved by management in advance.