Data Engineering Technical Lead

Job Location: Toronto, Canada

Monthly Salary: Not Disclosed

Vacancy: 1

Job Description

The successful candidate will be part of a team responsible for the correct, timely, and efficient flow of data from various source systems to various endpoints.

You Will:

  • Develop data flows based on requirements, implementing the necessary steps of data transformation, data validation, and mapping using NiFi, SQL, and Python (a minimal sketch follows this list).
  • Develop the components needed to connect to various data sources and destinations via API, SQL, S3, SFTP, and other communication protocols.
  • Maintain and apply changes to existing flows running on the NiFi ETL tool.
  • Perform thorough testing and validation to ensure the accuracy of data transformations, data verification, and delivery.
  • Develop unit and regression tests (a test sketch follows this list).
  • Prepare release plans and coordinate with stakeholders and release engineers to ensure a proper release.
  • Provide post-implementation support and troubleshoot any issues.
  • Provide second- and third-level support to operations teams and resolve issues in a timely manner.
  • Prepare documentation of existing and new data flows as well as client procedures.
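
To illustrate the first responsibility above, here is a minimal Python sketch of a transform/validate/map step. It is not from the posting; the field names, the mapping, and the record shape are all hypothetical.

    # Minimal transform/validate/map sketch for JSON-like records.
    # All field names and the mapping are hypothetical.
    from datetime import datetime, timezone

    REQUIRED_FIELDS = {"trade_id", "amount", "currency"}   # hypothetical schema
    FIELD_MAP = {"trade_id": "id", "amount": "notional", "currency": "ccy"}

    def validate(record: dict) -> None:
        """Reject records missing required fields or with a non-numeric amount."""
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            raise ValueError(f"missing fields: {sorted(missing)}")
        float(record["amount"])  # raises ValueError if amount is not numeric

    def transform(record: dict) -> dict:
        """Rename source fields to destination names and stamp the load time."""
        out = {dst: record[src] for src, dst in FIELD_MAP.items()}
        out["loaded_at"] = datetime.now(timezone.utc).isoformat()
        return out

    if __name__ == "__main__":
        raw = {"trade_id": "T-1", "amount": "100.5", "currency": "CAD"}
        validate(raw)
        print(transform(raw))

In a NiFi deployment, logic like this would typically live in a scripted processor or an external service the flow calls; the sketch only shows the shape of the transformation itself.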
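
And a matching unit/regression test sketch using pytest, assuming the functions above sit in a hypothetical module named flows.py:

    # pytest unit/regression tests for the hypothetical flows.py module above.
    import pytest
    from flows import transform, validate

    def test_transform_maps_fields():
        raw = {"trade_id": "T-1", "amount": "100.5", "currency": "CAD"}
        out = transform(raw)
        assert out["id"] == "T-1"
        assert out["ccy"] == "CAD"

    def test_validate_rejects_missing_fields():
        with pytest.raises(ValueError):
            validate({"trade_id": "T-1"})  # amount and currency missing

Regression tests would pin known input/output pairs in the same way, so that changes to a flow cannot silently alter its output.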



Qualifications:

You Have:

  • Bachelor's degree in Computer Science or a related technical field, or equivalent experience.
  • 5 years of commercial experience with ETL tools (experience with Apache NiFi will be considered an advantage).
  • Experience in data-related process controls.
  • Experience working with a team of developers using agile methodologies to achieve planned process deliverables.
  • Experience with at least one of the following: Amazon Redshift, Snowflake, or Databricks.
  • Extensive experience writing Python and PySpark, specifically for writing complex data ingestion pipelines, developing ETL/ELT jobs, and implementing distributed data processing solutions (a PySpark sketch follows this list).
  • Experience with CI/CD, e.g. implementing continuous integration workflows using tools like GitHub Actions, GitLab CI, or Jenkins.
  • High level of accuracy and attention to detail.
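
To illustrate the Python/PySpark requirement above, here is a minimal PySpark ETL sketch; the S3 paths, column names, and de-duplication key are hypothetical:

    # Minimal PySpark ETL sketch: read raw CSV, clean, write partitioned Parquet.
    # Paths and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("ingest-trades").getOrCreate()

    raw = spark.read.option("header", True).csv("s3://bucket/raw/trades/")

    cleaned = (
        raw.dropDuplicates(["trade_id"])                        # de-duplicate on key
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())                 # drop unparseable rows
           .withColumn("trade_date", F.to_date("trade_date"))
    )

    cleaned.write.mode("overwrite").partitionBy("trade_date").parquet(
        "s3://bucket/curated/trades/"
    )

A job like this is also what a CI pipeline (GitHub Actions, GitLab CI, or Jenkins, per the CI/CD item above) would lint and test on every commit before deployment.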

Preferred:

  • Experience with programming languages such as Python will be considered an advantage.
  • Experience with Apache Kafka or AWS Kinesis is desirable (a consumer sketch follows this list).
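
For the streaming item above, a minimal consumer sketch using the kafka-python client; the topic, broker address, and group id are hypothetical (a Kinesis consumer built on boto3 would fill the same role):

    # Minimal Kafka consumer sketch (kafka-python client).
    # Topic, broker address, and group id are hypothetical.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "trades",                              # hypothetical topic
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
        auto_offset_reset="earliest",
        group_id="etl-demo",
    )

    for message in consumer:
        record = message.value                 # already deserialized to a dict
        print(record)                          # hand off to transform/validate here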


Additional Information:

What's in it for you to join MUFG Investor Services?

Take a look at our careers site and you'll find everything you'd expect from a career with the fastest-growing business at one of the world's largest financial groups. Now take another look. Because it's how we defy expectations that really defines us. You'll feel that difference in all kinds of ways. Our vibrant CULTURE. Connected team. Love of innovation, laser client focus, and next-level LEARNING & DEVELOPMENT. Oh, and we really walk the talk when it comes to HYBRID WORKING.

So why settle for the ordinary? Apply now for a Brilliantly Different career.

We thank all candidates for applying; however only those proceeding to the interview stage will be contacted.

We are an equal opportunity employer.

 


Remote Work: No

Employment Type: Full-time

