Prometeo Talent is ranked as the #1 Recruitment Agency in America and Europe. We excel in connecting companies with exceptional technology and engineering professionals.
We are partnering with a venture-backed startup founded by repeat entrepreneurs and security veterans with successful exits. Backed by top-tier investors and advised by leaders from companies such as Databricks, Netflix, Mozilla, and Semgrep, this startup is reimagining how secure software gets built.
They build AI-native security tools that automate security design reviews and threat modeling, helping engineering teams identify and mitigate risks early without the usual meetings, slides, and spreadsheets. Their mission: help product teams ship secure software faster.
They are currently seeking a Data Engineer (AI track) based in Europe - Poland, Slovakia, or the Czech Republic (Remote).
We are looking for a Data Engineer who is eager to grow into the world of AI. If you are curious, driven, and ready to expand from data infrastructure into machine learning and large language models (LLMs), this is your opportunity.
You will work closely with the founding team to design systems that extract, transform, and model complex codebases into graph-based architectural representations. Over time, you will help push the boundaries of semantic code understanding, inference models, and secure-by-design AI tooling.
- Work with the founders to build systems that process and model code.
- Create and manage data pipelines in the cloud (AWS preferred).
- Use SQL (PostgreSQL) and vector databases to handle data.
- Write Python code, including asynchronous code with asyncio.
- Help develop tools for code understanding and AI security.
- Collaborate with engineers and product managers to deliver features.
- 5 years of experience building scalable data systems or data-intensive applications.
- Advanced Python proficiency, especially with asyncio.
- Experience designing and deploying data pipelines in cloud environments (AWS).
- Strong knowledge of SQL (PostgreSQL) and vector databases (for example, pgvector).
- Experience with AI model APIs and embedding-based retrieval.
- Understanding of performance, caching, and orchestration in large data systems.
- Hands-on experience with Docker and Terraform.
- Familiarity with CI/CD pipelines, testing, and agile workflows.
- Knowledge of LLM concepts: fine-tuning, prompt engineering, and hallucination reduction.
- Exposure to semantic code analysis, AST parsing, or graph-based code modeling.
- Familiarity with serverless architectures (AWS Lambda).
- Contributions to open-source AI or data infrastructure projects.
- Competitive pay and great benefits.
- 100% remote work.
- Budget to set up your home office.
- Opportunity to work at the frontier of AI code analysis and secure-by-design systems.
- Team-building company trips once or twice a year.
Full Time