Key Responsibilities
- Design, build, and maintain scalable batch and real-time data pipelines.
- Develop and support RESTful and streaming APIs for event-driven data ingestion.
- Manage queuing and streaming systems to enable resilient low-latency processing.
- Design and operate streaming architectures with fault tolerance, schema evolution, and data quality controls.
- Build and maintain structured and unstructured databases (e.g. SQL Server, BigQuery).
- Develop and optimise queries and data models to support analytics and reporting.
- Manage enterprise data warehouses and Lakehouses, ensuring data integrity and governance.
- Perform data migration, transformation, and normalisation across systems.
- Collaborate with cross-functional teams to deliver data solutions that drive business insight.
- Monitor, tune, and troubleshoot data pipelines and databases, continuously improving performance and reliability.
- Support the integration of AI and LLM-driven capabilities into the data platform, ensuring data readiness, quality, and governance for future AI use cases.
Qualifications:
Required Skills & Experience
- Experience as a Data Engineer, Data Analyst, or in a similar role.
- Strong database expertise including performance tuning and optimisation.
- Hands-on experience with relational and analytical databases (e.g. MS SQL Server, BigQuery).
- Experience building and managing data warehouses, data lakes, or Lakehouse architectures.
- Solid understanding of ETL pipelines and cloud-based data platforms.
- Experience integrating APIs with data systems.
- Strong Python skills for data engineering, automation, and transformation.
- Experience working with structured and semi-structured data (e.g. JSON).
- Understanding of data security, governance, and modern data architecture patterns (e.g. Medallion, Kimball).
- Experience preparing and managing data for AI/LLM applications and machine learning use cases.
Nice to Have
- Experience with Databricks including structured streaming and pipelines.
- Familiarity with Delta tables, partitioning, and performance optimisation.
- Experience with GraphQL for analytics or data access APIs.
- Knowledge of the hospitality or hotel industry, particularly booking engines.
- Experience working across complex organisations with multiple stakeholders.
- Experience with the Domo BI tool.
About You
- A proactive problem solver who enjoys tackling complex data challenges.
- Curious, experimental, and eager to learn, comfortable iterating and learning from mistakes.
- Strong communicator able to explain technical concepts to non-technical stakeholders.
- Proven track record of designing, building, and evolving data warehouses and data lakes.
- Comfortable working in a fast-paced, collaborative, and evolving environment.
Additional Information:
What's in it for you:
- 33 days holiday, pension, and life insurance
- A health cash plan to claim money back, plus access to a wide range of ways to support your physical and mental wellbeing
- A wide range of retail and hospitality perks through our partners
- Excellent discounts across the entire Ennismore family for you and your nearest and dearest (even after you leave!)
- Extra time off to volunteer with one of our partner charities
- Cycle to work scheme
- The opportunity to join an innovative, fast-growing international hospitality group.
- The chance to help build a modern enterprise-scale data platform from the ground up.
- A creative, collaborative, and rewarding working environment.
- Be part of a passionate team focused on delivering exceptional hospitality experiences.
- Competitive package with strong opportunities for growth and development.
Remote Work:
No
Employment Type:
Full-time