Responsibilities:
- Develop, implement, and maintain data models and transformation logic using DBT (Data Build Tool) and Snowflake.
- Design and build scalable, efficient data pipelines to transform and load data into Snowflake from various data sources.
- Collaborate with data scientists, analysts, and other engineers to understand business requirements and deliver solutions that meet those needs.
- Optimize existing data models, queries, and transformations to improve performance, reliability, and scalability.
- Develop and maintain DBT projects to manage data transformations, ensuring consistency, versioning, and efficiency.
- Write clean, maintainable, and well-documented SQL code for data transformation and modeling.
- Implement data quality checks and monitoring processes to ensure the accuracy and completeness of data.
- Work with cloud-based data storage and processing technologies (Snowflake, S3, etc.) to manage large datasets effectively.
- Conduct code reviews and ensure adherence to data engineering best practices.
- Stay up to date with the latest trends in data engineering, cloud technologies, and DBT/Snowflake best practices.

Required Skills and Qualifications:
- Proven experience as a Data Engineer with a strong focus on Snowflake and DBT.
- Strong proficiency in SQL and experience working with large, complex datasets.
- Hands-on experience with DBT to build, test, and deploy data transformation workflows.
- In-depth knowledge of Snowflake data warehouse architecture, features, and best practices.
- Experience with cloud platforms (e.g., AWS) and data storage technologies such as S3.
- Strong understanding of data modeling, ETL/ELT processes, and data transformation techniques.
- Ability to optimize queries and transformations for performance and scalability.
- Experience working with version control tools (e.g., Git).
- Ability to write clear, maintainable code following industry best practices.
- Familiarity with data visualization tools and technologies (e.g., Power BI) is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills to work effectively with both technical and non-technical teams.

Preferred Skills:
- Experience with Python or other programming languages for data engineering.
- Familiarity with workflow orchestration tools (e.g., Airflow).
- Knowledge of data governance, security, and privacy practices.
- Experience with data lake architectures and integration.
- Exposure to DevOps practices and CI/CD pipelines for data engineering.
- Experience with Fivetran is an added advantage.