DevOps Engineer 1

Inetum


Job Location:

Warsaw - Poland

Monthly Salary: Not Disclosed
Posted on: 21 hours ago
Vacancies: 1 Vacancy

Job Summary

We are seeking a DevOps Engineer to set up, configure, and operationalize a new Databricks environment, with a primary focus on business intelligence (BI) analytics and data engineering workflows.

Working closely with our ML Ops Engineers, you will ensure Databricks is fully prepared for both traditional BI/data processing use cases and AI workloads, including secure access for data analysts, seamless integration with downstream AI and BI tools, and optimized data pipelines.

Key Responsibilities

Environment Setup & Configuration

  • Deploy and configure Databricks workspace(s) for multi-team usage.

  • Manage shared clusters, automated job clusters, and interactive clusters (see the provisioning sketch after this list).

  • Configure role-based permissions aligned with governance policies.
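
For illustration, a minimal sketch of the kind of shared-cluster provisioning this work covers, using the Databricks Python SDK; the runtime version, node type, and workspace authentication below are placeholder assumptions, not the actual environment configuration:

```python
# Sketch: provision a shared autoscaling cluster with the Databricks Python SDK.
# Runtime version and node type are illustrative placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import AutoScale, DataSecurityMode

w = WorkspaceClient()  # auth taken from DATABRICKS_HOST / DATABRICKS_TOKEN or a config profile

cluster = w.clusters.create(
    cluster_name="shared-bi-interactive",
    spark_version="15.4.x-scala2.12",                     # placeholder LTS runtime
    node_type_id="Standard_DS3_v2",                       # placeholder Azure node type
    autoscale=AutoScale(min_workers=2, max_workers=8),
    autotermination_minutes=30,                           # limit idle cost
    data_security_mode=DataSecurityMode.USER_ISOLATION,   # shared, multi-user access
).result()                                                # wait until the cluster is running

print(f"Cluster ready: {cluster.cluster_id}")
```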

Data Integration Enablement

  • Establish secure connections to on-prem and cloud data sources (SQL Server, Data Lake, APIs).

  • Build shared ingestion pipelines for BI and analytics teams.

  • Automate daily/weekly data refresh schedules (see the scheduling sketch after this list).
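
As an illustrative sketch of that refresh automation, a scheduled ingestion job could be registered through the Databricks Python SDK; the notebook path, cluster id, and cron expression are placeholders:

```python
# Sketch: register a daily refresh job for a shared ingestion notebook (paths and ids are placeholders).
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import CronSchedule, NotebookTask, Task

w = WorkspaceClient()

job = w.jobs.create(
    name="daily-bi-refresh",
    tasks=[
        Task(
            task_key="refresh_shared_marts",
            notebook_task=NotebookTask(notebook_path="/Shared/ingestion/refresh_shared_marts"),
            existing_cluster_id="<shared-job-cluster-id>",  # placeholder
        )
    ],
    # 05:00 every day, Warsaw local time
    schedule=CronSchedule(quartz_cron_expression="0 0 5 * * ?", timezone_id="Europe/Warsaw"),
)
print(f"Scheduled job {job.job_id}")
```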

Connectivity for BI Tools

  • Integrate Databricks with BI platforms (e.g. Power BI).

  • Optimize query connectors and JDBC/ODBC configurations (see the connectivity sketch after this list).
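
A minimal sketch of validating the SQL endpoint that BI tools such as Power BI point at, using the Databricks SQL connector for Python; the hostname, HTTP path, and token are assumed to come from the environment:

```python
# Sketch: run a smoke-test query against a SQL warehouse endpoint before wiring up BI tools.
# Hostname, HTTP path and token are placeholders read from the environment.
import os
from databricks import sql

with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],   # e.g. /sql/1.0/warehouses/<warehouse-id>
    access_token=os.environ["DATABRICKS_TOKEN"],
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchone())
```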

Operational Excellence

  • Implement monitoring and logging for jobs and pipelines.

  • Define backup and disaster recovery processes.

  • Apply cost tracking and optimization practices for cluster usage.

Automation & CI/CD

  • Set up CI/CD pipelines for data engineering code.

  • Manage deployment workflows for notebooks, SQL queries, and data models (see the deployment sketch after this list).
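
As one possible shape for the deployment step in such a pipeline, a sketch that pushes repository notebooks into a workspace folder with the Databricks Python SDK; the target folder and repository layout are assumptions, and a real pipeline might use Databricks asset bundles or Repos instead:

```python
# Sketch of a CI deploy step: upload notebook sources from the repo into a workspace folder.
# Target folder and repo layout are assumptions; auth comes from CI secrets via environment variables.
import base64
from pathlib import Path

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ImportFormat, Language

w = WorkspaceClient()
target_root = "/Shared/deployments/bi"        # placeholder workspace folder
w.workspace.mkdirs(target_root)

for nb in Path("notebooks").glob("*.py"):     # assumes notebooks are exported as .py sources
    w.workspace.import_(
        path=f"{target_root}/{nb.stem}",
        content=base64.b64encode(nb.read_bytes()).decode(),
        format=ImportFormat.SOURCE,
        language=Language.PYTHON,
        overwrite=True,
    )
    print(f"Deployed {nb.name} -> {target_root}/{nb.stem}")
```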

Collaboration

  • Partner with ML Ops Engineers to align infrastructure for ML and BI use cases.

  • Work with Data Engineers to maintain central data sources.

  • Collaborate with security teams to enforce access controls.

Governance

  • Enforce GDPR and internal compliance rules.

  • Maintain workspace auditing and logging.

  • Document environment setup and operational procedures.


Qualifications:

  • Proven experience in DevOps / Data Platform operations with cloud-based tools.

  • Strong experience in Databricks environment administration.

  • Proficiency in Python for automation.

  • Familiarity with BI tool integration via Databricks connectors.

  • Solid knowledge of SQL and data engineering fundamentals.

  • Experience with orchestration/scheduling tools (Databricks Workflows, Airflow, Azure Data Factory).

  • Understanding of Identity & Access Management in cloud environments.

Preferred Qualifications

  • Experience in multi-team Databricks environments with ML and BI workloads.

  • Familiarity with the Azure Databricks ecosystem (Azure Data Lake, Synapse Analytics).

  • Exposure to Infrastructure-as-Code tooling (Terraform, ARM templates).

  • Performance tuning for BI queries on large datasets.

  • Knowledge of Delta Lake data architecture.

Must have:

  • Terraform

  • Python (automation)

  • English (B2) and Polish (B1) language skills.

Nice to have:

  • Databricks environment administration (ideally on Azure).


Additional Information:

Hybrid work model: 3 days from the office, 2 days of home office.

Office locations: Warszawa, Lublin, Poznań


Remote Work:

No


Employment Type:

Full-time



About Company


Inetum is a European leader in digital services. Inetum’s team of 28,000 consultants and specialists strive every day to make a digital impact for businesses, public sector entities and society. Inetum’s solutions aim at contributing to its clients’ performance and innovation as well ...
