Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines using Snowflake, SQL, and ETL tools.
- Migrate existing ETL workflows to Snowflake and optimize them for performance.
- Design and manage data warehouse architectures, data models, and best practices.
- Collaborate with data architects, business analysts, and other stakeholders to gather requirements and ensure data quality.
- Work with large datasets, design data transformation frameworks, and develop solutions to support analytics and reporting.
- Implement monitoring, alerting, and performance tuning.
- Ensure data security, privacy, and governance compliance across data platforms.
- Document technical solutions, data flows, and system designs.

Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 10 to 12 years of professional experience in Data Engineering / ETL / Data Warehousing.
- 3 to 5 years of hands-on experience with Snowflake.
- Proficiency in SQL, Python, and at least one ETL tool (e.g., Informatica, Talend, SSIS, DataStage, DBT).
- Deep understanding of data modeling (Star/Snowflake schema), data marts, and dimensional modeling.
- Experience with cloud platforms (AWS, Azure, or GCP) and cloud-native data integration tools.
- Strong performance tuning, troubleshooting, and debugging skills for Snowflake and ETL jobs.
- Familiarity with CI/CD practices and version control tools like Git.
- Strong communication, documentation, and interpersonal skills.