Cloud Platform Engineer
Prague - Czech Republic
Job Summary
Prague | Brno
Part-Time | Full-Time
Are you passionate about building robust, scalable cloud infrastructure that powers the next generation of data-driven applications? Do you thrive on automating everything through code and ensuring seamless delivery pipelines? We are looking for a Cloud Platform Engineer who bridges the gap between infrastructure and data engineering. If you have an ownership mindset, love solving complex platform puzzles, and want to work in a high-impact, cloud-native environment, we'd love to meet you.
What You'll Do
As a Cloud Platform Engineer at DataSentics, you will be the backbone of our technical delivery, ensuring our data and AI solutions run on world-class infrastructure. You will work closely with data engineers and scientists to build stable, secure, and automated environments. Your core responsibilities will include:
- Architecting the Foundation: Design and implement cloud infrastructure (primarily AWS/Azure) using Infrastructure as Code (Terraform).
- Automating Delivery: Build and maintain robust CI/CD pipelines (GitLab/GitHub) to ensure frequent and reliable deployments.
- Platform Enablement: Own and solve platform-related topics around Databricks, including Unity Catalog integration, permissions, and service principal management.
- Containerization: Manage and orchestrate services using Docker and Kubernetes (or Azure Container Apps) to host internal data applications.
- Data Ops & Security: Set up monitoring, logging, and alerting systems while ensuring top-tier security through managed identities, secrets management, and access control.
Why Join Us?
At DataSentics, we are a team of tech enthusiasts who believe that great data science is only possible with great engineering. We combine deep technical expertise with a practical, product-driven approach. You'll be part of a culture that values technical excellence, continuous learning, and the freedom to suggest and implement the best tools for the job.
Our Projects & Technology Stack
We build enterprise-grade data platforms and internal applications across industries like finance, retail, and manufacturing.
- Cloud Ecosystems: Deep dives into AWS and Azure.
- Data Orchestration: Advanced environments involving Databricks job clusters, complex pipelines, and automated permissions.
- Modern DevOps: Heavy use of Terraform, Docker, and Git-based automation.
- How we work: Agile teams (2–5 people), remote-first culture, and a focus on building it right the first time.
- Flexibility: Work from anywhere with optional offices in Prague and Brno; flexible roles starting at 30 hours/week.
What We're Looking For
Must-haves:
- Cloud Proficiency: Experience with cloud platforms, ideally AWS/Azure.
- Automation Expert: Proven track record with CI/CD (GitLab/GitHub pipelines) and Terraform.
- Container Skills: Hands-on experience with Docker (Kubernetes or Azure Container Apps is a plus).
- Platform Troubleshooting: Ability to solve complex integrations involving Databricks and data applications.
- Mindset: A strong sense of ownership, independence, and the investigative skills needed to fix broken systems.
Nice-to-haves:
- Familiarity with Databricks administration (Unity Catalog, permissions, service principals).
- Basic to intermediate Python skills for automation and scripting.
- Understanding of Data Engineering lifecycles (pipelines, orchestration, storage).
- Experience with cloud security (secrets, IAM) and monitoring/alerting tools.
What You'll Get
- The opportunity to shape the infrastructure of cutting-edge AI and Data products.
- A collaborative environment focused on innovation and technical excellence.
- Flexibility, remote work options, and a supportive, curious team culture.
Curious to know more? If you're excited about infrastructure as code and want to empower data teams to build amazing things, let's talk!
Required Experience:
IC
About Company
We solve business challenges for large organizations by building end-to-end AI solutions, industry-specific accelerators and harnessing the latest technologies, including MLOps and generative AI.