AI & Data Engineer [Interim]

Riverflex


Job Location:

Rotterdam - Netherlands

Monthly Salary: Not Disclosed
Posted on: 18 hours ago
Vacancies: 1 Vacancy

Job Summary

Job description

Job Title: AI & Data Engineer (Interim)
Location: Rotterdam (hybrid)
Engagement: Full-time, 5 days a week

Contract duration: 3-6 months (possible extension)

Start date: ASAP

Context

A large international company in the marine engineering / offshore construction sector is scaling production-grade AI across the organization. The Data & AI Platform team is expanding its delivery capacity and is looking for an interim Data & AI Engineer who can step in immediately, take ownership of AI use cases end-to-end, and accelerate adoption across business teams on an Azure Databricks foundation.

This is a hands-on delivery role with high autonomy: you will build, ship, operate, and improve AI solutions in production, while introducing reusable components and pragmatic best practices that raise the bar across the platform.

What you'll do

  • Deliver AI use cases end-to-end: from ingestion and feature engineering to model/agent development and production rollout.

  • Design and operate Databricks lakehouse pipelines (batch and streaming) using Spark, SQL, and Delta Lake, including monitoring and data quality controls.

  • Build AI solutions on the platform including:

    • RAG patterns (retrieval, chunking, embeddings, evaluation)

    • tool-using agents and orchestration approaches

    • prompt strategies and testing/guardrails

    • (where relevant) custom ML models and supporting pipelines

  • Productionize and run what you build: reliability, observability, cost control, and operational hygiene.

  • Enable other teams by creating reusable components, templates, and delivery standards.

  • Work with governance and compliance: align with AI governance requirements and ensure solutions are secure and auditable.

  • Collaborate with stakeholders across IT and the business to translate needs into working solutions and clear delivery increments.

What success looks like (first weeks)

  • Rapidly understand the current Azure/Databricks landscape and delivery priorities.

  • Pick up 1–2 active use cases and move them toward production-quality standards.

  • Strengthen delivery patterns (templates, evaluation approach, monitoring, data quality checks).

  • Create momentum with visible working increments and pragmatic documentation.

Required experience

  • Proven experience as a Data Engineer / Data & AI Engineer delivering solutions into production environments.

  • Strong hands-on Databricks expertise: Spark/SQL, Delta Lake, Jobs/Workflows, performance tuning.

  • Strong Python and SQL skills for data engineering and AI/ML workflows.

  • Experience building data pipelines with quality checks and operational monitoring.

  • Practical experience with LLM-based solutions (RAG and/or agents) including prompt strategies and evaluation approaches.

  • Comfortable working independently in an interim context: you can own delivery, communicate clearly, and unblock yourself.

Nice to have

  • Azure services exposure (e.g. Azure ML, Azure OpenAI, Key Vault, Functions, ADF).

  • LLM toolkits (LangChain, Semantic Kernel), prompt evaluation frameworks, early LLMOps patterns.

  • CI/CD (GitHub Actions) and Infrastructure-as-Code (Terraform).

  • ML frameworks (PyTorch, TensorFlow, scikit-learn) where needed.

Why this assignment

  • Immediate impact: deliver AI use cases into production on a modern Azure Databricks platform.

  • High ownership and autonomy: a true interim role where delivery outcomes matter.

  • Real-world relevance: projects tied to large-scale operations in a complex safety- and compliance-aware environment.



Required Experience:

IC


Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala

About Company


Build your digital, data and technology capabilities with us
