Job Description
- Responsible for ingesting data sources and performing feature engineering to support the creation of modelling data layers, and for integrating data into models and workflows.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation using cloud services and other technologies.
- Create and maintain data solutions in Azure using any of the following: Data Factory, Synapse, Fabric.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and resolve data pipeline problems, ensuring the consistency and availability of the data.
Qualifications
- Good Python skills.
- Knowledge of good data design principles (such as Kimball and star schema), data warehousing concepts, and data modelling.
- Strong experience with relational and, ideally, NoSQL databases.
- Strong understanding of ETL/ELT processes.
- Exposure to data monitoring and observability tools.
Additional Information:
At Endava, we're committed to creating an open, inclusive, and respectful environment where everyone feels safe, valued, and empowered to be their best. We welcome applications from people of all backgrounds, experiences, and perspectives, because we know that inclusive teams help us deliver smarter, more innovative solutions for our customers. Hiring decisions are based on merit, skills, qualifications, and potential. If you need adjustments or support during the recruitment process, please let us know.
Remote Work:
No
Employment Type:
Full-time