Engineering
We are seeking a highly skilled Data Engineer with strong expertise in Snowflake, ETL/ELT concepts, and dbt to design, build, and optimize scalable data pipelines. The ideal candidate will have advanced SQL skills, experience with cloud-based data platforms, and a strong understanding of data warehousing best practices.
Key Responsibilities
Design, develop, and maintain scalable data pipelines using Snowflake and dbt
Write and optimize advanced SQL queries for performance and reliability
Implement ETL/ELT processes to ingest and transform data from multiple sources
Develop Python scripts for automation, data processing, and API integrations
Build and manage data workflows using AWS services such as Glue, Lambda, S3, and CloudFormation
Design and maintain data warehouse models, schemas, and transformations
Collaborate with cross-functional teams to understand data requirements and deliver analytical solutions
Implement and maintain version control, CI/CD pipelines, and development best practices
Monitor, troubleshoot, and optimize data pipelines for performance and cost efficiency
Qualifications
Required Skills
Strong hands-on experience with Snowflake
Advanced SQL proficiency
Strong understanding of ETL/ELT concepts and data pipelines
Hands-on experience with dbt
Solid knowledge of data warehousing concepts, including schema design and data modeling
Proficiency in Python for scripting and automation
Good to Have Skills
Experience with AWS services (Glue, Lambda, S3, CloudFormation)
Familiarity with Git and CI/CD practices
Understanding of APIs and CRUD operations
Exposure to cloud-native data architectures
Remote Work:
Yes
Employment Type:
Full-time
Nagarro helps future-proof your business through a forward-thinking, fluidic, and CARING mindset. We excel at digital engineering and help our clients become human-centric, digital-first organizations, augmenting their ability to be responsive, efficient, intimate, creative, and sustainable.