
Lead AI Platform Engineer

Employer Active

1 Vacancy
Job Location

Bengaluru - India

Monthly Salary

Not Disclosed


Job Description

Lead Databricks Data Engineer

Ecolab is looking for a Data Engineer to be part of a dynamic team that's at the forefront of technological innovation. We're leveraging cutting-edge AI to create novel solutions that optimize operations for our clients, particularly within the restaurant industry. Our work is transforming how restaurants operate, making them more efficient and sustainable.

As a key player in our new division, you'll have the unique opportunity to help shape its culture and direction. Your contributions will directly impact the success of our innovative projects and help define the future of our product. You will experience the best of both worlds with this team at Ecolab: the agility and creativity of a startup paired with the stability and resources of a global leader. Our collaborative environment fosters innovation while providing the support and security you need to thrive.

Responsibilities

  • Design, develop, and maintain scalable and robust data pipelines on Databricks (Spark SQL, PySpark, Delta Lake).
  • Collaborate with data scientists and analysts to understand data requirements and deliver solutions.
  • Optimize and troubleshoot existing data pipelines for performance and reliability.
  • Ensure data quality and integrity across various data sources.
  • Implement data security and compliance best practices.
  • Monitor data pipeline performance, implement data quality checks, and conduct necessary maintenance and updates.
  • Document data pipeline processes and technical specifications.
  • Implement robust pipeline orchestration using tools such as Databricks Workflows, dbt, or similar.
  • Generate and maintain data quality reports and dashboards.
  • Implement Infrastructure as Code (IaC) principles for managing Databricks infrastructure.
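To illustrate the kind of data quality checks this role involves, here is a minimal sketch in plain stdlib Python. In a real Databricks pipeline these rules would typically run as PySpark or Delta Live Tables expectations; the column names, rule names, and thresholds below are hypothetical, chosen only to keep the example self-contained.

```python
# Minimal data-quality check sketch: validate a batch of rows before
# writing them downstream. Rows are plain dicts standing in for
# DataFrame records; all names and thresholds are illustrative.

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def run_quality_checks(rows, max_null_rate=0.05):
    """Return a mapping of rule name -> pass/fail for a batch of rows."""
    return {
        "order_id_not_null": null_rate(rows, "order_id") == 0.0,
        "amount_mostly_present": null_rate(rows, "amount") <= max_null_rate,
        "amount_non_negative": all(
            r["amount"] >= 0 for r in rows if r.get("amount") is not None
        ),
    }

# Example batch: one of three rows is missing `amount` (33% > 5% threshold),
# so that rule fails while the other two pass.
batch = [
    {"order_id": 1, "amount": 12.50},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": 7.25},
]
report = run_quality_checks(batch)
```

A failing report like this would typically block the write and feed the data quality dashboards mentioned above.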

Minimum Qualifications

  • Bachelor's degree and 8 years' work experience; or no degree and 12 years of combined education and equivalent work experience.
  • 3 years of experience (work or educational) with a data engineering focus.
  • Proven experience with Databricks (Delta Lake, Workflows, Asset Bundles).
  • Proven experience with distributed data processing technologies (Spark SQL, PySpark).
  • Strong knowledge of designing and developing ETL pipelines.
  • Experience with data quality monitoring and reporting.
  • Experience working in a collaborative environment with data scientists and software engineers.
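Since the qualifications above call out Databricks Asset Bundles, a rough sketch of a bundle configuration may help situate that term. This is an illustrative fragment of a `databricks.yml`, not a verified production configuration; the bundle name, job, notebook path, and cluster sizing are all hypothetical.

```yaml
# databricks.yml -- hypothetical Asset Bundle defining one pipeline job.
bundle:
  name: restaurant-data-pipelines

targets:
  dev:
    workspace:
      host: https://example.cloud.databricks.com  # placeholder workspace URL

resources:
  jobs:
    ingest_orders:
      name: ingest-orders
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform_orders.py
          new_cluster:
            spark_version: 14.3.x-scala2.12
            node_type_id: i3.xlarge
            num_workers: 2
```

Bundles like this let the Databricks job, cluster, and notebook definitions live in version control and be deployed per target, which is one common way the IaC responsibility above is met.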

Preferred Qualifications

  • Master's degree (MS) in Computer Science or a related engineering field.
  • Proficiency in Databricks (Delta Lake, Workflows, Asset Bundles).
  • Proficiency in distributed data processing technologies (Spark SQL, PySpark).
  • Experience with pipeline orchestration tools (Databricks Workflows, dbt, etc.).
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Working experience with machine learning platforms and tools.
  • Experience with real-time data streaming technologies (e.g., Kafka, Kinesis).

Employment Type

Full-Time

