L4 Data Management (Data Engineer)

Job Location

Quebec - Canada

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Join our team and what we'll accomplish together

The Data Strategy and Implementation team's mission is to make TELUS a global leader in data solutions. We are the architects of the future: we build unified, scalable data platforms and pave the way for a product- and data-driven culture that fuels decisions and innovation across the company. Our work is the foundation on which advanced analytics and groundbreaking artificial intelligence (AI) are built.


As a data science expert on our team, you will be at the heart of this transformation. You will apply your expertise in data systems, architecture, and software development to build and deploy the robust, high-quality data pipelines that power our business. You will work closely with data scientists, machine learning engineers, data architects, and business stakeholders across the company to turn complex data problems into reliable, scalable, and automated data solutions.

What you will do

  • Partner with business and technology stakeholders to understand data requirements and translate them into technical designs for resilient, high-performance data pipelines.
  • Lead the design, development, and deployment of sophisticated data ingestion, transformation, and delivery solutions using modern cloud-native technologies.
  • Develop and maintain robust, scalable data pipelines for continuous and reliable data flow, ensuring the quality and availability of data for machine learning models, analytics, and critical business operations.
  • Work with diverse and complex datasets, engineering elegant solutions to extract, model, and prepare data for a variety of downstream use cases.
  • Architect and implement the transformation and modernization of our data solutions, leveraging Google Cloud Platform (GCP) native services and other leading-edge data technologies.
  • Identify opportunities for, and implement, automation in the development and production lifecycle to improve efficiency, reliability, and speed.
  • Champion and implement best practices in software engineering, data engineering, and data management within the team.
  • Continuously assess the evolving data technology landscape and identify new opportunities to drive innovation and efficiency.

What you bring

  • A proven track record of designing, building, and deploying data pipelines and solutions that deliver tangible business value, reflected in your 5 years of IT platform implementation experience.
  • Bachelor's degree in Computer Science, Engineering, or an equivalent field.
  • Deep understanding and hands-on experience in data engineering principles and best practices, with advanced knowledge of Dataflow, Spark, or Kafka.
  • Strong systems design experience, with the ability to architect, document, and explain complex system interactions, data flows, and APIs.
  • Excellent communication skills, with the ability to articulate complex data concepts and solutions clearly to diverse technical and non-technical audiences.
  • Advanced experience with Python and mastery of data manipulation libraries (like Pandas), paired with strong SQL skills for complex querying and data analysis.
  • Familiarity with at least one major cloud computing platform (GCP, AWS, Azure), with practical experience deploying data solutions.
  • Strong analytical and problem-solving skills, with a talent for translating complex business needs into scalable and maintainable data solutions.
  • A passion for innovation, a results-oriented mindset, an agile approach with a bias for action, and a change agent mentality.

Great-to-haves

  • GCP Professional Data Engineer certification.
  • Practical experience with Databricks, Azure, or AWS data services.
  • An understanding of telecommunications data models or TMF standards.
  • Experience in MLOps, supporting the data infrastructure required for machine learning workflows.
  • Familiarity with Infrastructure as Code (e.g. Terraform) and CI/CD practices.

Employment Type

Full-Time
