Azure Data Engineer


Job Location:

Bow Island, Canada

Monthly Salary: CAD 10 - 10
Experience Required: 5 years
Posted on: 30+ days ago
Vacancies: 1 Vacancy

Job Summary

Job Description:

Key Responsibilities:

  • Develop and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Azure Databricks.
  • Implement data ingestion flows from diverse sources, including Azure Blob Storage, Azure Data Lake, on-prem SQL, and SFTP.
  • Design and optimize data models and transformations using Oracle, Spark SQL, PySpark, SQL Server, and Progress DB SQL.
  • Build orchestration workflows in ADF using activities such as Lookup, ForEach, Execute Pipeline, and Set Variable.
  • Perform root cause analysis and resolve production issues in pipelines and notebooks.
  • Collaborate on CI/CD pipeline creation using Azure DevOps and Jenkins.
  • Apply performance tuning techniques to Azure Synapse Analytics and SQL DW.
  • Maintain documentation, including runbooks, technical design specs, and QA test cases.

Data Pipeline Engineering:

  • Design and implement scalable, fault-tolerant data pipelines using Azure Synapse and Databricks.
  • Ingest data from diverse sources, including flat files, DB2, NoSQL, and cloud-native formats (CSV, JSON).
Technical Skills Required:

  • Cloud Platforms: Azure (ADF, ADLS, ADB, Azure SQL, Synapse, Cosmos DB)
  • ETL Tools: Azure Data Factory, Azure Databricks
  • Programming: SQL, PySpark, Spark SQL
  • DevOps & Automation: Azure DevOps, Git, CI/CD, Jenkins
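The extract-stage-transform flow described above can be sketched in miniature with plain Python: local CSV text and SQLite stand in for Blob Storage and Azure SQL, and the aggregation query stands in for a Spark SQL transformation. This is purely illustrative; all table and column names (staging, daily_totals, etc.) are hypothetical, not part of the role's actual systems.

```python
# Minimal ETL sketch: extract a CSV feed, load it into a staging table,
# then transform it with SQL. SQLite and in-memory text are stand-ins for
# Azure SQL and Blob Storage; every name here is hypothetical.
import csv
import io
import sqlite3

# --- Extract: raw CSV as it might arrive from Blob Storage or SFTP ---
raw_csv = io.StringIO(
    "order_id,region,amount\n"
    "1,West,100.0\n"
    "2,East,250.5\n"
    "3,West,75.25\n"
)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (order_id INTEGER, region TEXT, amount REAL)")

# --- Load: parse the feed and bulk-insert into staging ---
rows = [(int(r["order_id"]), r["region"], float(r["amount"]))
        for r in csv.DictReader(raw_csv)]
conn.executemany("INSERT INTO staging VALUES (?, ?, ?)", rows)

# --- Transform: aggregate into a reporting table, the kind of model a
#     Spark SQL / PySpark notebook step would produce at scale ---
conn.execute("""
    CREATE TABLE daily_totals AS
    SELECT region, COUNT(*) AS orders, ROUND(SUM(amount), 2) AS total
    FROM staging
    GROUP BY region
    ORDER BY region
""")

totals = conn.execute("SELECT * FROM daily_totals").fetchall()
print(totals)  # [('East', 1, 250.5), ('West', 2, 175.25)]
```

In a real ADF pipeline the same shape appears as a Copy activity into a staging zone followed by a Databricks notebook or stored-procedure activity for the transform, with ForEach/Lookup handling the per-source fan-out.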


Required Skills:

  • Hands-on experience architecting Guidewire ClaimCenter solutions, including customization and integration. Guidewire certification is a plus.
  • Technologies of interest: Guidewire Cloud, Salesforce CRM, legacy modernization, and AWS.
  • Proven knowledge and architecture experience across digital, microservices, macro, and monolithic services and APIs; application integration; service-oriented architecture; event-driven architecture; application architecture; distributed architecture; and data architecture, with experience in modelling languages and techniques.
  • Can quickly comprehend the functions and capabilities of new technologies, and can understand both the long-term (big picture) and short-term perspectives of situations.
  • Strong technical background (platforms, languages, protocols, frameworks, open source, etc.). Experience with architecture frameworks (TOGAF) and architecture certifications a plus.
  • Experience engaging and supporting claims teams and understanding their day-to-day operations in the P&C insurance space.
  • Open and clear communication with the business, telecom, infrastructure, security, audit, vendors, and software engineering.
  • Driven by challenges and proactive; stays current on security standard methodologies and understands their impact; comfortable working in a constantly evolving technological environment.
  • An excellent teammate who demonstrates leadership, comfortable speaking with all levels of the organization and different audiences.


Company Industry

IT Services and IT Consulting

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala