Create and manage scalable data pipelines to collect, process, and store large volumes of data from multiple sources
Integrate data from diverse systems while ensuring consistency, quality, and reliability
Design, implement, and optimize database schemas and data structures to support efficient storage and retrieval
Develop, maintain, and enhance ETL (Extract, Transform, Load) processes to move data accurately and efficiently between systems
Qualifications :
Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, Mathematics, or a related field
5 years of hands-on experience with SQL, SSIS, and SSAS
2 years of experience working with Azure cloud services, especially SQL Server, Azure Data Factory (ADF), Azure Databricks, ADLS, Key Vault, Azure Functions, and Logic Apps (strong focus on Databricks)
2 years of experience using Git for version control and deploying code through CI/CD pipelines
Remote Work :
No
Employment Type :
Full-time
Founded in 1997, DataArt is a global software engineering firm and a trusted technology collaborator for market leaders and visionaries. Guided by the People-first principle, our world-class team designs and engineers data-driven, cloud-native solutions that foster progress …