Salary Not Disclosed
1 Vacancy
We are seeking an experienced and driven Azure Data Engineer to design, build, and maintain data pipelines that serve as the backbone of our enterprise data platform. In this role, you will be responsible for crafting reliable ETL/ELT workflows using Azure Data Factory, Python, and CData, ensuring seamless integration and synchronization across critical systems such as Sage, Paylocity, and HubSpot.
You will work closely with cross-functional teams to build data flows that support operational efficiency, reporting accuracy, and scalability. This is an opportunity to make a significant impact on high-volume, high-visibility data systems in a growing environment.
Design, develop, and manage robust ETL/ELT pipelines using Azure Data Factory, Python, or orchestration tools like Airflow.
Create and maintain staging, transformation, and production workflows in Azure SQL.
Develop logic for data validation, deduplication, and error handling to ensure pipeline reliability and trustworthiness.
Manage sync queues for critical systems including Sage (accounting), Paylocity (payroll), HubSpot (CRM), and others.
Configure and maintain middleware endpoints using CData or similar platforms for streamlined data flow between systems.
Write efficient SQL queries, views, stored procedures, and scripts for analytics, transformation, and system integration.
Collaborate with stakeholders to design data solutions aligned with business processes and compliance standards.
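The validation, deduplication, and error-handling responsibility above can be sketched in plain Python. The record shape (`id`, `amount`, `source`) and field rules here are hypothetical placeholders, not this employer's actual schema; a real pipeline would map fields from the Sage, Paylocity, or HubSpot extracts.

```python
# Hypothetical required fields for an incoming record batch.
REQUIRED_FIELDS = {"id", "amount", "source"}


def validate(record: dict) -> list:
    """Return a list of validation errors for one record (empty = clean)."""
    errors = ["missing field: %s" % f for f in REQUIRED_FIELDS - record.keys()]
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        errors.append("amount is not numeric")
    return errors


def dedupe_and_validate(records: list) -> tuple:
    """Split records into (clean, rejected).

    Invalid records are rejected with their error list; duplicates
    (same id) keep only the first occurrence, so the step is idempotent
    when a batch is replayed after a partial failure.
    """
    seen, clean, rejected = set(), [], []
    for rec in records:
        errors = validate(rec)
        if errors:
            rejected.append({"record": rec, "errors": errors})
        elif rec["id"] in seen:
            continue  # duplicate: drop silently, first copy already kept
        else:
            seen.add(rec["id"])
            clean.append(rec)
    return clean, rejected
```

Keeping rejected records alongside their error messages, rather than dropping them, is what makes the pipeline auditable: the reject queue can be logged or surfaced to stakeholders.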
3 years of experience in data engineering or a related technical field.
Proficiency with Azure Data Factory, Airflow, or similar orchestration frameworks.
Strong command of SQL (preferably Azure SQL, but experience with other RDBMS accepted).
Hands-on experience with Python scripting for ETL logic and transformations.
Experience with CData or equivalent data connectivity/middleware tools.
Deep understanding of data validation, lineage, and audit-trail frameworks.
Familiarity with ingesting data from various formats (e.g., CSV, Excel) and APIs.
Experience working on high-volume data workflows involving job-costing, payroll, ERP, or CRM systems.
Experience building resilient data sync frameworks with retry logic, circuit breaking, and error monitoring.
Familiarity with role-based data governance, logging, and compliance standards in enterprise environments.
Prior experience in cloud-native architecture and performance optimization.
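The "retry logic, circuit breaking" requirement in the list above refers to a standard resilience pattern for sync calls against external systems. A minimal sketch, assuming a simple consecutive-failure threshold and fixed cooldown (the class name and parameters are illustrative, not any particular library's API):

```python
import time


class CircuitOpen(Exception):
    """Raised when the breaker is refusing calls after repeated failures."""


class CircuitBreaker:
    """Minimal circuit breaker: opens after `threshold` consecutive
    failures and rejects calls until `cooldown` seconds have passed,
    then allows a single probe call (half-open state)."""

    def __init__(self, threshold=3, cooldown=30.0):
        self.threshold = threshold
        self.cooldown = cooldown
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.cooldown:
                raise CircuitOpen("breaker open; skipping sync call")
            self.opened_at = None  # half-open: let one probe through
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.threshold:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # success resets the failure count
        return result
```

Wrapping each outbound Sage/Paylocity/HubSpot sync call in a breaker like this keeps one failing endpoint from stalling the whole queue, and the `CircuitOpen` exceptions give the error-monitoring layer a clean signal to alert on.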
Reliable and scalable data pipelines supporting real-time and batch workloads.
Accurate and timely synchronization of critical business systems.
High data integrity and auditability across workflows.
Clear documentation and reusability of data infrastructure components.
Full Time