
Data Engineer (Snowflake, Airflow)

Job Location: Delhi - India

Monthly Salary: INR 1800000 - 2500000

Vacancy: 1

Job Description

Sr. Data Engineer (Snowflake, Airflow)
Experience: 5-8 Years
Work Mode: Remote
Job Type: Full-time
Mandatory Skills: Python, PySpark, SQL, Snowflake, Airflow, ETL, Data Pipelines, Elasticsearch or AWS.


Role Overview:


We are looking for a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.


Responsibilities:


Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
Build and optimize data pipelines using a variety of technologies, including Elasticsearch, AWS S3, Snowflake, and NFS (an illustrative sketch follows this list).
Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
Stay current with industry best practices (CI/CD, DevSecFinOps, Scrum) and emerging technologies in data engineering.
Contribute to the development and enhancement of our data warehouse architecture.
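
For illustration only, the sketch below shows the kind of Airflow pipeline described above: extract data staged in AWS S3, then load it into Snowflake. It is not part of the job description; the DAG id, task names, and placeholder callables are hypothetical and assume a recent Airflow 2.x installation with the PythonOperator.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_s3(**context):
    # Placeholder: download or stage raw files from an assumed S3 bucket.
    pass


def load_to_snowflake(**context):
    # Placeholder: load the staged files into an assumed Snowflake table,
    # e.g. via COPY INTO or the Snowflake provider operators.
    pass


with DAG(
    dag_id="example_s3_to_snowflake_elt",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_from_s3", python_callable=extract_from_s3)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load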


Required Skills:


Bachelor's degree in Computer Science, Engineering, or a related field.
5 years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
At least 3 years of experience with Snowflake data warehousing technologies.
At least 3 years of experience creating and maintaining Airflow ETL pipelines.
Minimum 3 years of professional experience with Python for data manipulation and automation (a brief example follows this list).
Working experience with Elasticsearch and its application in data pipelines.
Proficiency in SQL and experience with data modelling techniques.
Strong understanding of cloud-based data storage solutions such as AWS S3.
Experience working with NFS and other file storage systems.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
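
As a small illustration of the Python and Snowflake skills listed above (a sketch only, not part of the posting), the snippet below runs a query through the snowflake-connector-python package; the account, credentials, and query are hypothetical placeholders.

import snowflake.connector

# All connection parameters below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_warehouse",
    database="my_database",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Simple sanity-check query against the Snowflake warehouse.
    cur.execute("SELECT CURRENT_VERSION()")
    print(cur.fetchone())
finally:
    conn.close()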

elastic search, etl, data pipelines, snowflake, airflow, python, pyspark, sql, pipelines, aws

Employment Type: Full Time
