About us
Graphcore is one of the world's leading innovators in Artificial Intelligence compute. It is developing the hardware, software and systems infrastructure that will unlock the next generation of AI breakthroughs and power the widespread adoption of AI solutions across every industry.
As part of the SoftBank Group, Graphcore is a member of an elite family of companies responsible for some of the world's most transformative technologies. Together they share a bold vision: to enable Artificial Super Intelligence and ensure its benefits are accessible to everyone.
Graphcore's teams are drawn from diverse backgrounds and bring a broad range of skills and perspectives. A melting pot of AI research specialists, silicon designers, software engineers and systems architects, Graphcore brings together deep expertise to solve complex problems and deliver meaningful progress in AI compute.
Job Summary
Reporting to the Head of Data & Analytics, the Lead Data Engineer is a senior individual contributor responsible for leading a key area of Graphcore's data platform and engineering practices. This role combines hands-on technical delivery with technical leadership across data pipelines, platform capabilities and data products that support analytics, reporting and operational decision-making. Working closely with stakeholders across technical and business functions, the Lead Data Engineer helps shape the direction of the data platform, drives improvements to reliability, scalability and governance, and enables teams across Graphcore to make better use of trusted data.
The Team
The Data & Analytics team enables better decision-making across Graphcore by building trusted data foundations, scalable platforms and high-quality data products. The team works across a broad range of business and technical domains, partnering with colleagues throughout the company to improve access to reliable information, strengthen operational insight and support efficient, data-informed ways of working. Within this team, the Lead Data Engineer plays a key role in evolving the platform, setting engineering standards and delivering robust solutions that scale with business needs.
Responsibilities and Duties
- Lead the design, build and evolution of robust data pipelines and platform services that support analytics, reporting and operational use cases across Graphcore.
- Own the data engineering stack, planning and delivering improvements to reliability, scalability, maintainability, performance and security.
- Build and operate Python-based batch and streaming workflows with clear approaches to orchestration, testing, deployment, monitoring and incident resolution.
- Design and implement data solutions on AWS using services such as S3, Lambda, Aurora PostgreSQL, Athena, Glue and Redshift, ensuring they are secure, resilient and cost-conscious.
- Define and apply engineering standards for data quality, observability, documentation, release processes and operational support.
- Partner with analysts, engineers and business stakeholders to translate requirements into trusted datasets, well-structured data models and reusable data products.
- Drive improvements to platform resilience through approaches such as idempotent processing, retry and recovery mechanisms, buffering strategies, and backfill or replay capabilities.
- Lead technical decision-making in your area by reviewing designs and code, sharing expertise and helping to raise the quality bar for data engineering across the team.
- Build and maintain CI/CD workflows and development practices that enable safe, repeatable and efficient delivery of data infrastructure and workflows.
- Ensure appropriate data protection and access controls are in place, including least-privilege access, secure secrets handling and suitable database permissions.
- Contribute to the development of internal tools and lightweight applications that improve access to data and support self-serve workflows.
- Work across teams to identify opportunities for platform and process improvements, helping shape the direction of data engineering within the wider Data & Analytics function.
Candidate Profile
Essential
- Strong experience designing, building and operating production-grade data pipelines and data platforms in Python.
- Strong hands-on experience with modern data orchestration, testing, deployment and monitoring practices in a production environment.
- Experience building solutions on AWS data services, including storage, processing and query technologies.
- Strong understanding of data modelling, data quality, schema design and performance optimisation across relational and analytical systems.
- Experience designing reliable data systems that recover gracefully from failure and operate effectively in real-world production conditions.
- Experience working with batch and streaming data pipelines, including operational support, troubleshooting and continuous improvement.
- Strong knowledge of security and access control principles for data platforms, including IAM, database permissions and secure handling of credentials and secrets.
- Experience providing technical leadership as a senior individual contributor through design reviews, code reviews, standards-setting and mentoring of others.
- Ability to work effectively with both technical and non-technical stakeholders, turning business needs into practical, scalable data solutions.
- Strong communication skills, with the ability to explain technical decisions clearly and influence outcomes across teams.
Desirable
- Experience with Prefect or a similar workflow orchestration platform.
- Experience with streaming or data collection technologies.
- Experience with PostgreSQL, Redshift, ClickHouse or similar database and warehouse technologies.
- Experience with CI/CD tooling and Infrastructure as Code approaches.
- Experience building lightweight internal tools or data applications using Python frameworks such as Streamlit or Flask.
- Familiarity with dbt and working models that combine data engineering and analytics engineering.
- Understanding of operational best practices for cloud-based data platforms, including cost optimisation and observability.
- Experience working in a fast-moving product, technology or engineering-led environment.
Required Experience:
IC