About the Role
We are seeking a highly experienced Lead Data Engineer to drive the design, development, and delivery of scalable data solutions. This role combines hands-on technical leadership with project ownership, close client collaboration, and team mentorship within complex cloud-based data environments.
Key Responsibilities
Lead and manage end-to-end project delivery, ensuring quality, timeliness, and scalability.
Collaborate with clients and offshore teams to gather requirements and define high-level solution designs.
Oversee development and unit testing activities aligned with architectural standards.
Provide clear and consistent communication to stakeholders on project progress and risks.
Proactively identify, track, and mitigate project risks and issues.
Partner with management to ensure smooth execution and delivery.
Mentor and guide the engineering team on technical design, implementation, and best practices.
Required Qualifications
8 years of proven experience in Data Engineering and end-to-end project delivery.
Strong background in Big Data architectures and cloud-based implementations.
Advanced expertise in Snowflake, SQL, and AWS services (Glue, EMR, S3, Aurora, RDS, and overall AWS architecture).
Extensive hands-on experience with Python, PySpark, and AWS cloud microservices.
Strong capabilities in coding, debugging, performance optimization, and production deployments.
Excellent analytical and problem-solving skills with a strong ownership mindset.
Solid experience working within Agile methodologies and iterative delivery models.
Ability to quickly learn new technologies and lead teams through adoption.
Strong communication skills with the ability to manage technical and business stakeholders effectively.
Education
Preferred Qualifications
Experience with DevOps tools and CI/CD pipelines (e.g., Jenkins, Git).
Strong knowledge of Spark, advanced SQL, and Big Data frameworks.
Experience supporting cloud migration and modernization initiatives.
Exposure to the U.S. insurance or reinsurance domain.
Knowledge of Data Vault 2.0 modeling.