Job description
About Us
DeepSea was founded to bring the latest AI technology to the world of shipping, making it leaner, greener, and better connected. Our cloud platforms harness deep learning models trained on real-time data, enabling our customers to optimise the routes, speeds, operation, and maintenance of their vessels, saving fuel, money, and time while reducing environmental impact. DeepSea talent is spread across Greece, the United Kingdom, the Netherlands, Romania, Thailand, and Japan, and we are growing quickly.
We are a scale-up company, meaning you will help bring us to the next level and make a tangible contribution to a more sustainable shipping world. There is no complex hierarchy and no long approval processes. We just deliver. You will be joining a company of highly skilled, smart people who love solving complex problems.
Main Responsibilities
Design, build, and maintain the data infrastructure and pipelines that support AI model training and analytics.
Collaborate with a team of software architects, software engineers, AI engineers, and data quality engineers to translate technical and business requirements into Databricks solutions based on established best practices.
Work seamlessly across disciplines and deliver solutions in a complex, fast-paced environment.
Facilitate a culture of peer review and constructive feedback to enhance the quality of solutions.
Apply a strong understanding of data security, compliance, and governance, including data privacy regulations, secure data handling, auditability, and enterprise-grade data management.
Job requirements
Main Requirements
Bachelor's or Master's degree in Engineering, Mathematics, Physics, Computer Science, or a related field.
Proven expertise in designing, developing, and rolling out scalable, production-grade data processing systems for ML/AI and analytics workloads, preferably using Databricks or similar.
Strong understanding of relational and non-relational database design, including normalization, indexing, and partitioning strategies.
Expertise in schema modeling for structured and semi-structured data (e.g. Delta Lake, Parquet, JSON).
Hands-on expertise with Databricks or similar: Spark-based data engineering, Delta Lake warehouses, ETL/ELT data integration, MLflow integration, collaborative notebooks, and scalable model deployment.
Ability to optimize query performance using techniques such as caching, bucketing, and predicate pushdown.
Solid grasp of data warehousing concepts and performance tuning in cloud environments (e.g. Azure, AWS).
Experience in mentoring cross-functional teams.
Programming skills in Python, SQL, Pandas, NumPy, SciPy, Spark, or similar frameworks.
Familiarity with MLOps practices: pipeline automation, model versioning, and model CI/CD.
Strong communication and collaboration skills in an agile, dynamic, and cross-disciplinary environment.
Nice to Have
Experience in Databricks Structured Streaming.
Experience using Kafka, Flink, or similar.
Familiarity with Databricks on AWS, including integration with S3 APIs and secure data access patterns.
Knowledge of Dimensional Modeling (Star/Snowflake schemas) for BI and reporting use cases.
Familiarity with Unity Catalog for managing data assets and permissions across workspaces.
Understanding of streaming data architectures using Structured Streaming or Delta Live Tables.
Experience with data observability tools (e.g. Monte Carlo, Databand) for monitoring pipeline health.
What we offer:
You will be part of a dynamic team focused on delivering results and continuous improvement while disrupting the industry, and you will also receive:
Competitive remuneration package: Skill & experience-based salary and eligibility for additional employee benefits.
Health Package: Private health insurance coverage and a mental health (therapist) benefit.
Paid leave: emergency and medical leave.
Learning and Development Package: Access to a courses platform and eligibility for seminars, conferences, and workshops.
Remote Flexibility: Great office space in the heart of Athens, with a hybrid option.
At DeepSea, we are looking for people who share our values and are aligned with our mission. It is important to us to ensure that no one who is eager and capable of contributing constructively to our team is excluded because of ethnic or social origin, gender or sexuality, age or family status, disability or medical conditions, etc. Diversity is well proven to be a vital characteristic of successful teams, so we do everything we can to make our environment welcoming and safe for everyone.
Data Privacy
The company ensures that the personal data of candidates is handled with care and in compliance with GDPR regulations. Your personal data will be stored securely and only for the duration necessary under the law. If the recruitment process is unsuccessful, your data will be retained for 5 (five) years so that you can be considered for future opportunities. After this period, your data will be deleted. If you do not wish for the company to keep your CV and personal data, please send an email to