
Data Engineer – AI & Analytics Pipeline

Job Location

Capital Department - Argentina

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

The Role:

We are seeking a motivated and intellectually curious Data Engineer to join our growing Data Science and Solutions team. This role is ideal for someone passionate about AI, data integration, and building modern data infrastructure. You will play a key role in scaling and optimizing our AI and analytics platform by developing robust, secure, and scalable data pipelines in Databricks on AWS.

You'll collaborate closely with AI/ML experts, backend and frontend engineers, and product stakeholders to transform data into impactful insights and intelligent user experiences. If you're eager to work in a dynamic, remote-first environment where your contributions directly influence real-world outcomes, we want to hear from you.

Responsibilities:

Data Pipeline Development

- Design, build, and maintain ETL/ELT pipelines in Databricks to ingest, clean, and transform data from a variety of sources.

- Develop gold layer tables in a Lakehouse architecture to support machine learning models and real-time APIs; a minimal sketch follows this list.

- Monitor data quality, lineage, and reliability, leveraging Databricks best practices and observability tools.
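For concreteness, a silver-to-gold transformation of the kind described above might look like the following PySpark sketch. The table names (bronze.raw_events, gold.user_daily_metrics) and columns are hypothetical placeholders, not details from this posting.

# Minimal sketch, assuming a Databricks runtime and hypothetical
# bronze/gold table names.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze -> silver cleaning step: drop malformed rows, deduplicate on event id.
silver = (
    spark.table("bronze.raw_events")
         .where(F.col("event_id").isNotNull())
         .dropDuplicates(["event_id"])
)

# Silver -> gold aggregation: one row per user per day, ready for ML and APIs.
gold = (
    silver.groupBy("user_id", F.to_date("event_ts").alias("event_date"))
          .agg(F.count("*").alias("event_count"))
)

# Persist the curated gold table (Delta is the Databricks default format).
gold.write.mode("overwrite").saveAsTable("gold.user_daily_metrics")

Writing the output with saveAsTable keeps it queryable as a managed Delta table, which is how gold-layer tables are typically served to models and APIs on Databricks.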

AI-Driven Data Access Enablement

- Collaborate with AI/ML teams to structure and model data for natural language prompts, semantic retrieval, and vector search using Unity Catalog metadata; an illustrative query sketch follows this list.

- Contribute to the development of data interfaces and agent tools for secure, role-based access to structured and unstructured data.
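Purely as an illustration of semantic retrieval on Databricks, a vector search lookup could be sketched as below. The endpoint and index names are invented, and the exact client surface may vary across databricks-vectorsearch versions.

from databricks.vector_search.client import VectorSearchClient

# Hypothetical endpoint/index names; credentials are assumed to come
# from the ambient Databricks environment.
client = VectorSearchClient()
index = client.get_index(
    endpoint_name="docs-endpoint",           # assumption
    index_name="catalog.schema.docs_index",  # assumption
)

# Semantic retrieval: return the five chunks most similar to the prompt.
results = index.similarity_search(
    query_text="How do I reset my password?",
    columns=["doc_id", "chunk_text"],
    num_results=5,
)
print(results)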

API & Serverless Backend Integration

- Partner with backend engineers to create serverless APIs (e.g., AWS Lambda, TypeScript) that expose curated data for front-end applications; a minimal handler sketch follows this list.

- Implement scalable, secure, and performant APIs with a strong focus on data governance and compliance.

- Develop infrastructure-as-code and monitoring frameworks to support multi-tenant scaling of pipelines and AI endpoints.
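The posting names TypeScript for Lambda; to keep all sketches here in one language, below is an equivalent minimal Python handler for an API Gateway proxy integration. The route shape and the stubbed metrics lookup are assumptions, not part of the role description.

import json

# Minimal API Gateway (proxy integration) Lambda handler sketch.
# The lookup is stubbed; in practice it might query a gold table via
# Databricks SQL or a serving layer (both assumptions).
def handler(event, context):
    user_id = (event.get("pathParameters") or {}).get("user_id")
    if not user_id:
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "user_id required"}),
        }

    # Placeholder for a real lookup against curated gold-layer data.
    metrics = {"user_id": user_id, "event_count": 0}

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(metrics),
    }

Returning the statusCode/headers/body shape above is what API Gateway's proxy integration expects from a Lambda handler.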


Requirements:

• 3 years of experience as a Data Engineer or a similar role in agile, distributed environments.
• Hands-on expertise with Databricks, including workflow orchestration, CDC, and medallion architecture.
• Strong skills in Spark or Scala for data wrangling and transformation across complex datasets.
• Experience with CI/CD pipelines, test-driven development, and an understanding of MLOps/AIOps best practices.
• Proven ability to collaborate effectively with cross-functional teams, including product managers, engineers, and data scientists.

Preferred Skills:

Experience with AWS Lambda and API Gateway, or other serverless frameworks.

          Understanding of API design principles and familiarity with RESTful and/or GraphQL endpoints.

          Exposure to React-based frontend architectures and awareness of how backend data delivery impacts UI/UX performance.

Experience with A/B testing, experimentation frameworks, and logging for model inference and user analytics.



Company

Wakapi Web

Employment Type

Full Time
