
L4 Data Management

Employer Active

Job Location

Toronto - Canada

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Our team and what we'll accomplish together


Be a part of a transformational journey with us, where innovative talent meets cutting-edge technologies. At TELUS, the Data Center of Excellence team is dedicated to making TELUS the most insights-driven company globally. We provide business intelligence, data assets, data products, business metrics, and data engineering that drive and enable BI and analytics in support of over 15,000 employees across diverse internal teams.


We are seeking a highly skilled and experienced Data Management Professional specializing in Google Cloud Platform (GCP) and BigQuery (BQ) to join our dynamic team. The ideal candidate will have a strong background in engineering complex data assets and BI/data solutions, along with experience in designing and implementing automated CI/CD pipelines and test automation. This role involves collaborating with cross-functional teams to architect, implement, and manage cloud-based data solutions while ensuring security, compliance, and optimal performance. As a senior team member, you will provide mentorship, support your colleagues, and help build their expertise and capabilities.


What you'll do

  • Design and implement cloud-based data solutions on Google Cloud Platform (GCP), BigQuery, Snowflake, and PostgreSQL to support business intelligence (BI) and data analytics requirements.
  • Define and establish best practices for automated regression testing approaches to ensure the reliability and accuracy of data solutions.
  • Establish robust change management and release management strategies to ensure smooth deployments and minimize downtime.
  • Establish the Software Development Lifecycle (SDLC) process and best practices for cloud-based projects, including requirements gathering, design, development, testing, deployment, maintenance, and documentation.
  • Define and communicate the cloud vision and strategy, aligning technical solutions with business objectives and future scalability.
  • Develop and maintain automated CI/CD pipelines for deploying data pipelines, analytics models, and applications.
  • Design and implement ETL (Extract, Transform, Load) processes and integration workflows to ingest, process, and transform data from various sources into usable formats for analysis and reporting.
  • Implement compliance best practices to protect data assets and ensure regulatory compliance (e.g. SOX).
  • Oversee configuration management processes to maintain consistency and scalability across cloud environments.
  • Implement API integration strategies to enable seamless communication and data exchange between systems and applications.
  • Design, develop, and optimize data models and Explores within the Looker platform.
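
The ETL responsibility above can be pictured with a minimal sketch. The Python example below is purely illustrative (the posting prescribes no implementation, and all names and data are hypothetical stand-ins, not TELUS systems): it extracts raw records, transforms them into an analysis-ready shape, and loads them into a target that stands in for a warehouse table.

```python
import csv
import io

# Hypothetical raw feed; in practice this would come from a source system.
RAW_CSV = """order_id,amount_cad,region
1001,250.00,ON
1002,99.50,bc
1003,410.25,ON
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast types and normalize region codes."""
    return [
        {
            "order_id": int(r["order_id"]),
            "amount_cad": float(r["amount_cad"]),
            "region": r["region"].upper(),
        }
        for r in rows
    ]

def load(rows: list[dict], target: list) -> int:
    """Load: append transformed rows to the target table; return row count."""
    target.extend(rows)
    return len(rows)

warehouse_table: list[dict] = []
loaded = load(transform(extract(RAW_CSV)), warehouse_table)
```

In a real GCP deployment the load step would write to BigQuery (e.g. via Dataflow or the BigQuery client library) rather than a Python list; the three-stage structure is the point of the sketch.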

Qualifications

What you bring

  • Bachelor's degree in Computer Engineering, Computer Science, or a related field.
  • 10 years of experience with SQL/PL SQL, stored procedures, BigQuery, and Dataflow (data pipeline tools), building complex datasets and developing data engineering pipelines.
  • 5-7 years of experience in cloud architecture and data engineering, with a focus on Google Cloud Platform (GCP).
  • 2 to 5 years of experience with infrastructure as code (IaC), including tools such as Pulumi and Terraform, CI/CD, and version control.
  • 2 to 5 years of experience with dashboard and visualization tools such as Tableau, Looker, and Looker Studio.
  • Experience designing and implementing automated CI/CD pipelines using native GCP services and tools like Jenkins, GitLab CI/CD, or similar.
  • Proven track record of delivering complex BI/data solutions in a cloud environment.
  • Strong understanding of regression testing methodologies and tools for data solutions.
  • Solid understanding of change management and release management principles and best practices.
  • Hands-on experience with DevOps/SRE/automation.
  • Experience with code repositories and version control systems (e.g. Git, Bitbucket).
  • Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
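
The regression-testing qualification above can be sketched in a few lines: freeze a known-good baseline of a transformation's output, then fail the check whenever a change alters the numbers. This is a hypothetical illustration only; function names, data, and figures are invented, and a real setup would diff warehouse query results inside a CI pipeline.

```python
def monthly_revenue(orders: list[dict]) -> dict[str, float]:
    """Stand-in for a warehouse query: total revenue per month."""
    totals: dict[str, float] = {}
    for o in orders:
        totals[o["month"]] = totals.get(o["month"], 0.0) + o["amount"]
    return totals

ORDERS = [
    {"month": "2024-01", "amount": 100.0},
    {"month": "2024-01", "amount": 50.0},
    {"month": "2024-02", "amount": 75.0},
]

# Frozen baseline captured from a known-good run.
BASELINE = {"2024-01": 150.0, "2024-02": 75.0}

def regression_check(actual: dict, expected: dict, tol: float = 1e-9) -> bool:
    """Pass only if the keys match and every value is within tolerance."""
    if actual.keys() != expected.keys():
        return False
    return all(abs(actual[k] - expected[k]) <= tol for k in expected)

ok = regression_check(monthly_revenue(ORDERS), BASELINE)
```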

Great-to-haves

  • Cloud certifications (GCP Professional Cloud Architect, GCP Professional Data Engineer, AWS Certified Solutions Architect, AWS Certified Developer).
  • Data certifications (data engineer, Oracle, or others).
  • Experience working with data visualization tools (e.g. Looker, Looker Studio, Tableau).
  • Familiarity with containerization and orchestration technologies (e.g. Docker, Kubernetes).
  • Experience with Azure and AWS.
  • Knowledge of security and compliance standards (e.g. SOX, GDPR) and best practices for cloud environments.
  • Experience designing data models for Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP) warehousing, ensuring optimal performance and scalability.

Join our team and be part of a dynamic environment where your expertise will make a significant impact on our success.

Employment Type

Full-Time

Company Industry

About Company

1000 employees
Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments, so we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the contact us page.