Snowflake Data Engineer
Location: Pune/ Nagpur
Duration: Full Time
Responsibilities of Snowflake Data Engineer
- We are looking for a savvy Data Engineer to join our growing team of analytics experts.
- The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as improving data flow and collection for cross-functional teams.
- The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
- The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure that the data delivery architecture remains consistent across ongoing projects.
- They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
- The right candidate will be excited by the prospect of optimizing or even redesigning our company's data architecture to support our next generation of products and data initiatives.
Requirements for Data Engineer
- SQL, Python, Snowflake, Data Modelling, ETL, Snowpark
- Proficiency in crafting and optimizing complex SQL queries and stored procedures for data transformation, aggregation, and analysis within the Snowflake platform.
- Experience with the Snowflake cloud data warehousing service, including data loading, querying, and administration.
- Ability to design and implement data models, applying both relational and dimensional modelling techniques within Snowflake.
- In-depth understanding of ETL processes and methodologies, leveraging Snowflake's capabilities.
- Familiarity with dbt (data build tool) for data transformation and modelling within Snowflake.
- Expertise in integrating Snowflake with Amazon S3 for data storage and retrieval.
- Proficiency in Snowpark, enabling data processing in Snowflake using Python, Java, or Scala.
- Skill in API integration, specifically integrating Snowflake with AWS Lambda for data workflows.
- Adeptness in version control using GitHub for collaborative code management.
- Adeptness in troubleshooting data-related issues within the Snowflake ecosystem, ensuring data quality and consistency.
- Skill in creating clear and concise technical documentation, facilitating communication and knowledge sharing.
- Designing efficient and well-structured data schemas within Snowflake.
- Utilizing Snowflake's features for data warehousing, scalability, and performance optimization.
- Leveraging Python programming for data processing, manipulation, and analysis in a Snowflake environment.
- Implementing data integration and transformation workflows using dbt.
- Writing and maintaining scripts for data movement and processing using cloud integrations.
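To illustrate the dimensional modelling and Python skills listed above, here is a minimal, self-contained sketch of the kind of transformation a candidate might write: splitting flat sales records into a star-schema-style customer dimension and fact table. The function name and record fields are invented for this example; real work would run inside Snowflake (e.g. via Snowpark) rather than on plain Python dictionaries.

```python
# Hypothetical sketch: dimensional modelling on flat records.
# All names (build_star_schema, customer_name, etc.) are illustrative only.

def build_star_schema(records):
    """Split flat order records into a customer dimension and a fact table."""
    dim_customer = {}  # natural key (customer name) -> surrogate key
    fact_sales = []
    for rec in records:
        key = rec["customer_name"]
        if key not in dim_customer:
            # Assign a new surrogate key on first sight of this customer.
            dim_customer[key] = len(dim_customer) + 1
        fact_sales.append({
            "customer_sk": dim_customer[key],  # reference the dimension row
            "order_date": rec["order_date"],
            "amount": rec["amount"],
        })
    return dim_customer, fact_sales

records = [
    {"customer_name": "Acme", "order_date": "2024-01-05", "amount": 120.0},
    {"customer_name": "Globex", "order_date": "2024-01-06", "amount": 75.5},
    {"customer_name": "Acme", "order_date": "2024-01-07", "amount": 60.0},
]
dim, fact = build_star_schema(records)
print(dim)        # {'Acme': 1, 'Globex': 2}
print(len(fact))  # 3
```

In a Snowflake setting, the same split would typically be expressed as SQL or Snowpark DataFrame operations, with surrogate keys generated by a sequence rather than in application code.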