Role Overview
We are looking for a Senior ETL Developer to design, develop, and maintain data integration pipelines feeding into our Snowflake Data Warehouse. The ideal candidate will work with diverse data sources, build efficient and scalable ETL/ELT processes, and ensure reliable, high-quality data delivery to support analytics and reporting needs.
Key Responsibilities
Design, develop, and maintain ETL/ELT processes to load data into Snowflake.
Integrate data from multiple sources (databases, APIs, SaaS platforms, flat files, cloud services).
Write and optimize SQL queries, stored procedures, and transformations.
Work closely with the ETL Architect to implement best practices for data integration and performance.
Conduct data profiling, validation, and quality checks to ensure accuracy and completeness.
Monitor and troubleshoot ETL workflows, ensuring timely resolution of issues.
Collaborate with data architects, BI developers, and business stakeholders to deliver reliable datasets.
Participate in performance tuning, cost optimization, and workload management in Snowflake.
Contribute to process automation and continuous improvement initiatives.
Required Skills & Qualifications
7-10 years of experience in ETL/ELT development.
Strong hands-on experience with ETL tools (Informatica, Talend, Matillion, SSIS, AWS Glue, dbt, Fivetran).
Expertise in Snowflake, including query tuning, warehouse optimization, and data loading techniques.
Strong proficiency in SQL and relational database concepts.
Experience with varied data sources such as Oracle, SQL Server, SAP, Salesforce, REST APIs, and flat files.
Solid understanding of data warehousing concepts (star schema, snowflake schema, dimensional modeling).
Hands-on experience with at least one cloud platform (AWS/Azure/GCP).
Good communication and problem-solving skills.
Nice to Have
Experience with real-time/streaming pipelines (Kafka, Kinesis, Snowpipe).
Familiarity with data governance and cataloging tools.
Exposure to BI tools (Power BI, Tableau, Qlik).
Basic knowledge of scripting (Python, Shell) for automation.