We are seeking a highly motivated and experienced Senior DevOps Engineer to join our growing team. You will play a pivotal role in automating and streamlining our data infrastructure, ensuring reliable and efficient data pipelines built with Airflow, Python, and DBT. This role demands both technical expertise in these tools and a passion for DevOps best practices.

Responsibilities:
- Design and implement robust CI/CD pipelines for data workflows using Airflow.
- Develop and maintain custom Python scripts for data processing and transformation.
- Use DBT for building and managing data models within the data warehouse.
- Provision and manage infrastructure on cloud platforms (e.g., AWS, Azure, GCP).
- Implement containerization technologies like Docker and Kubernetes.
- Monitor and troubleshoot data pipelines and infrastructure for performance and reliability.
- Apply security best practices and ensure data governance.
- Collaborate with data engineers to understand their needs and translate them into technical solutions.
- Stay up-to-date with the latest advancements in DevOps tools and technologies.

Required Skills:
- 7 years of experience in a DevOps or related role.
- Strong hands-on experience with Airflow, Python, and DBT.
- Proficiency in scripting languages like Bash/Shell.
- Experience with cloud platforms (AWS, Azure, GCP preferred).
- Familiarity with containerization technologies (Docker, Kubernetes).
- Strong understanding of CI/CD principles and practices.
- Knowledge of data security and governance best practices.

Bonus Points:
- Experience with data lineage and data quality tools.
- Experience with cloud-based data warehousing solutions.