Data Engineering Lead - Snowflake/Matillion/dbt
Location: Hyderabad
Experience: 7 to 10 Years
Experience:
- 7 to 10 years of experience in data engineering with hands-on expertise in data pipeline development, architecture, and system optimization.
- Proven track record in leading data engineering teams and managing end-to-end project delivery.
Technical Skills:
- Expertise in programming languages such as Python or Scala.
- Proficiency in designing and delivering data pipelines in Cloud Data Warehouses (e.g., Snowflake, Redshift) using various ETL/ELT tools such as Matillion, dbt, Striim, etc.
- Solid understanding of database systems (relational and NoSQL) and data modeling techniques.
- At least 2 years of hands-on experience designing and developing data integration solutions using Matillion and/or dbt.
- Strong knowledge of data engineering and integration frameworks.
- Expertise in architecting data solutions.
- Successfully implemented at least two end-to-end projects with multiple transformation layers.
- Good grasp of coding standards with the ability to define standards and testing strategies for projects.
- Proficiency in working with cloud platforms (AWS, Azure, GCP) and associated data services.
- Enthusiastic about working in an Agile environment.
- Comprehensive understanding of the DevOps process, including GitHub integration and CI/CD pipelines.
- Experience working with containerization (Docker) and orchestration tools (such as Airflow, Control-M).
Job Description
We are looking for an accomplished and dynamic Data Engineering Lead to join our team and drive the design, development, and delivery of cutting-edge data solutions. This role requires a balance of strong technical expertise, strategic leadership, and a consulting mindset. As the Lead Data Engineer, you will oversee the design and development of robust data pipelines and systems, manage and mentor a team of 5 to 7 engineers, and play a critical role in architecting innovative solutions tailored to client needs.
You will lead by example, fostering a culture of accountability, ownership, and continuous improvement while delivering impactful, scalable data solutions in a fast-paced consulting environment.
Key Responsibilities
1. Data Solution Design and Development:
- Architect, design, and implement end-to-end data pipelines and systems that handle large-scale, complex datasets.
- Ensure optimal system architecture for performance, scalability, and reliability.
- Evaluate and integrate new technologies to enhance existing solutions.
- Implement best practices in ETL/ELT processes, data integration, and data warehousing.
2. Project Leadership and Delivery:
- Lead technical projects, ensuring timelines and deliverables are met with high quality.
- Collaborate with cross-functional teams to align business goals with technical solutions.
- Act as the primary point of contact for clients, translating business requirements into actionable technical strategies.
3. Team Leadership and Development:
- Manage, mentor, and grow a team of 5 to 7 data engineers.
- Conduct code reviews and validations, and provide feedback to ensure adherence to technical standards.
- Provide technical guidance and foster an environment of continuous learning, innovation, and collaboration.
4. Optimization and Performance Tuning:
- Analyze and optimize existing data workflows for performance and cost-efficiency.
- Troubleshoot and resolve complex technical issues within data systems.
5. Adaptability and Innovation:
- Embrace a consulting mindset with the ability to quickly learn and adopt new tools, technologies, and frameworks.
- Identify opportunities for innovation and implement cutting-edge technologies in data engineering.
- Exhibit a "figure it out" attitude, taking ownership and accountability for challenges and solutions.
6. Client Collaboration:
- Engage with stakeholders to understand requirements and ensure alignment throughout the project lifecycle.
- Present technical concepts and designs to both technical and non-technical audiences.
- Communicate effectively with stakeholders to ensure alignment on project goals, timelines, and deliverables.
7. Learning and Adaptability:
- Stay updated with emerging data technologies, frameworks, and tools.
- Actively explore and integrate new technologies to improve existing workflows and solutions.
8. Internal Initiatives and Eminence Building:
- Drive internal initiatives to improve processes, frameworks, and methodologies.
- Contribute to the organization's eminence by developing thought leadership, sharing best practices, and participating in knowledge-sharing activities.
Skills: data solutions, data modeling, Python, data engineering, Azure, GitHub, integration, Redshift, Control-M, Scala, GCP, Docker, Airflow, Striim, AWS, CI/CD, Snowflake, Matillion, dbt