Senior Data Engineer
Location: Hyderabad
Experience: 5 to 8 Years
Mandatory Skills: Python or Scala (Programming Languages), Data Warehousing, ETL Tools, Relational and Non-Relational Databases, Any Cloud
Experience:
- 5 to 8 years of experience in data engineering with a strong focus on pipeline development and data modeling.
- Proven experience in leading small teams or mentoring junior engineers.
Technical Skills:
- Expertise in programming languages such as Python or Scala.
- Proficiency in designing and constructing data pipelines that ingest raw data into cloud data warehouses (e.g., Snowflake, Redshift), then cleanse and transform that data to meet required specifications using ETL/ELT tools such as Matillion, dbt, and Striim.
- Hands-on experience with relational and NoSQL databases and data modeling techniques.
- At least 2 years of hands-on experience designing and developing data integration solutions using Matillion and/or dbt.
- Strong knowledge of data engineering and integration frameworks.
- Successfully implemented at least one end-to-end project with multiple transformation layers.
- Good grasp of coding standards with the ability to define standards and testing strategies for projects.
- Proficiency in working with cloud platforms (AWS, Azure, GCP) and their associated data services.
- Enthusiastic about working in Agile methodology.
- Possess a comprehensive understanding of the DevOps process including GitHub integration and CI/CD pipelines.
- Familiarity with containerization (Docker) and orchestration tools (such as Airflow and Control-M).
About the role
We are seeking an experienced and innovative Senior Data Engineer to lead the design, development, and implementation of robust data pipelines and solutions for our clients. The ideal candidate will possess strong technical expertise, excellent leadership skills, and the drive to continuously learn and innovate.
As a Senior Data Engineer, you will lead a team of 3 to 5 talented engineers, ensuring high-quality deliverables while fostering a culture of innovation, collaboration, and technical excellence. You will also coordinate with clients to gather requirements, design technical solutions, and drive seamless project delivery.
Key Responsibilities
1. Data Pipeline Development:
- Design, build, and optimize scalable data pipelines to process and transform large datasets.
- Implement best practices in ETL/ELT processes, data integration, and data warehousing.
2. Team Leadership:
- Lead, mentor, and manage a team of 3 to 5 data engineers.
- Review team members' work to ensure adherence to technical standards and project timelines.
- Provide technical guidance and foster skill development within the team.
3. Client Collaboration:
- Work closely with clients to understand business requirements and translate them into technical solutions.
- Communicate effectively with stakeholders to ensure alignment on project goals, timelines, and deliverables.
4. Quality Assurance and Innovation:
- Conduct code reviews and validations to maintain high-quality standards and optimize system performance.
- Identify opportunities for innovation and implement cutting-edge technologies in data engineering.
5. Internal Initiatives and Eminence Building:
- Drive internal initiatives to improve processes, frameworks, and methodologies.
- Contribute to the organization's eminence by developing thought leadership, sharing best practices, and participating in knowledge-sharing activities.
6. Learning and Adaptability:
- Stay updated with emerging data technologies, frameworks, and tools.
- Actively explore and integrate new technologies to improve existing workflows and solutions.
- Exhibit a "figure-it-out" attitude, taking ownership and accountability for challenges and solutions.