Title : Azure Data Engineer
Location : Dallas, TX
Rate : On C2C
Job Summary: The Azure Data Engineer (Senior) is a highly experienced role responsible for leading the design, development, and maintenance of large-scale, complex data processing systems on the Azure platform. They work with data architects, data scientists, and other stakeholders to ensure data solutions are scalable, reliable, and secure.
Job Description:
Job Responsibilities
Design and implement data ingestion from multiple sources into Azure data storage services.
Implement Azure data services and tools to ingest, egress, and transform data from multiple sources.
Create ETL pipelines with the Azure ecosystem, including Azure Databricks and Azure Data Factory.
Build simple to complex pipelines, activities, datasets, and data flows.
Utilize Azure compute services (Databricks, Data Lake Store, PySpark, Apache Spark, Synapse, Data Factory) to implement transformation logic and stage transformed data; a sketch of this step follows the list below.
Design data ingestion into data modelling services to create cross-domain data models for end-user consumption.
Implement ETL-related jobs to curate, transform, and aggregate data into source models for end-user analytics use cases.
Implement scheduling, automation, and monitoring instrumentation for data movement jobs.
Working experience with Azure Monitor and Azure Log Analytics.
Background in legacy data warehouses and Big Data is a plus.
Should have fair knowledge of the consumption layer (BI) and the business processes.
Implement and support Azure DBaaS infrastructure and services.
Experience working in Agile/Scrum/Kanban team environments.
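As an illustration of the transformation-and-staging responsibility above, here is a minimal PySpark sketch of the kind of job that might run in an Azure Databricks notebook orchestrated by Azure Data Factory. The storage paths, dataset, and column names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: curate raw data landed in ADLS Gen2 and stage it for BI consumption.
# All paths, columns, and the "orders" dataset are assumed for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

# Ingest raw JSON landed by an upstream ingestion pipeline (placeholder path)
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Transformation logic: deduplicate, derive columns, aggregate
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.count("order_id").alias("order_count"),
       )
)

# Stage the transformed data as a Delta table for downstream models and BI
(curated.write.format("delta")
        .mode("overwrite")
        .save("abfss://curated@examplelake.dfs.core.windows.net/orders_daily/"))
```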
Candidate Profile / Qualification
A minimum of a Bachelor's degree in Computer Science or a related software engineering discipline, or equivalent.
7-9 years of experience working with SQL queries, Data Lake, Databricks, PySpark, Synapse, Data Factory, etc.
Strong ETL pipeline development experience with the Azure ecosystem, including Azure Databricks and Azure Data Factory; able to build simple to complex notebooks, pipelines, activities, datasets, and data flows.
Experience with message ingestion systems such as Kafka or Azure Event Hubs; a sketch of event publishing follows this list.
Experience in data modelling and proficient SQL development skills, including writing stored procedures, functions, transformations, etc.
Experience in PySpark and Data Factory is a must.
Knowledge of service provisioning, scripted provisioning, blueprint development for data service deployment, etc.
Excellent oral and written communication skills.
Candidates should possess a strong work ethic, good interpersonal and communication skills, and a high energy level.
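For the message-ingestion requirement above, a minimal sketch of publishing events to Azure Event Hubs with the Python azure-eventhub SDK is shown here; the connection string, hub name, and payload are placeholders rather than details from this posting.

```python
# Minimal sketch: send a small batch of events to Azure Event Hubs.
# Connection string, hub name, and payload are placeholders.
import json

from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="Endpoint=sb://example.servicebus.windows.net/;SharedAccessKeyName=send;SharedAccessKey=<key>",
    eventhub_name="orders",
)

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"order_id": 1, "amount": 42.0})))
    producer.send_batch(batch)
```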
Full-time