
Graphcore

Lead Data Engineer

Posted 14 Days Ago
Hybrid
Bristol, England, GBR
Senior level

About us 

Graphcore is one of the world’s leading innovators in Artificial Intelligence compute. It is developing hardware, software and systems infrastructure that will unlock the next generation of AI breakthroughs and power the widespread adoption of AI solutions across every industry. 

As part of the SoftBank Group, Graphcore is a member of an elite family of companies responsible for some of the world’s most transformative technologies. Together, they share a bold vision: to enable Artificial Super Intelligence and ensure its benefits are accessible to everyone. 

Graphcore’s teams are drawn from diverse backgrounds and bring a broad range of skills and perspectives. A melting pot of AI research specialists, silicon designers, software engineers and systems architects, Graphcore brings together deep expertise to solve complex problems and deliver meaningful progress in AI compute. 

Job Summary 

Reporting to the Head of Data & Analytics, the Lead Data Engineer is a senior individual contributor responsible for leading a key area of Graphcore’s data platform and engineering practices. The role combines hands-on delivery with technical leadership across data pipelines, platform capabilities and data products that support analytics, reporting and operational decision-making. Working closely with stakeholders across technical and business functions, the Lead Data Engineer helps shape the direction of the data platform, drives improvements to reliability, scalability and governance, and enables teams across Graphcore to make better use of trusted data. 

The Team 

The Data & Analytics team enables better decision-making across Graphcore by building trusted data foundations, scalable platforms and high-quality data products. The team works across a broad range of business and technical domains, partnering with colleagues throughout the company to improve access to reliable information, strengthen operational insight and support efficient, data-informed ways of working. Within this team, the Lead Data Engineer plays a key role in evolving the platform, setting engineering standards and delivering robust solutions that scale with business needs. 

Responsibilities and Duties 

  • Lead the design, build and evolution of robust data pipelines and platform services that support analytics, reporting and operational use cases across Graphcore. 
  • Own the data engineering stack, planning and delivering improvements to reliability, scalability, maintainability, performance and security. 
  • Build and operate Python-based batch and streaming workflows, with clear approaches to orchestration, testing, deployment, monitoring and incident resolution. 
  • Design and implement data solutions on AWS using services such as S3, Lambda, Aurora PostgreSQL, Athena, Glue and Redshift, ensuring they are secure, resilient and cost-conscious. 
  • Define and apply engineering standards for data quality, observability, documentation, release processes and operational support. 
  • Partner with analysts, engineers and business stakeholders to translate requirements into trusted datasets, well-structured data models and reusable data products. 
  • Drive improvements to platform resilience through approaches such as idempotent processing, retry and recovery mechanisms, buffering strategies and backfill or replay capabilities. 
  • Lead technical decision-making in your area by reviewing designs and code, sharing expertise and helping to raise the quality bar for data engineering across the team. 
  • Build and maintain CI/CD workflows and development practices that enable safe, repeatable and efficient delivery of data infrastructure and workflows. 
  • Ensure appropriate data protection and access controls are in place, including least-privilege access, secure secrets handling and suitable database permissions. 
  • Contribute to the development of internal tools and lightweight applications that improve access to data and support self-serve workflows. 
  • Work across teams to identify opportunities for platform and process improvements, helping shape the direction of data engineering within the wider Data & Analytics function. 
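The resilience techniques named above (idempotent processing, retry and recovery, backfill or replay) fit together in a few lines of Python. This is an illustrative sketch, not Graphcore's implementation: the manifest, `run_partition` and `with_retries` names are hypothetical, and it assumes batch loads keyed by partition.

```python
import time

def with_retries(step, attempts=3, base_delay=0.01):
    """Run a flaky step, retrying with exponential backoff before giving up."""
    for attempt in range(attempts):
        try:
            return step()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Manifest of partition keys already loaded. Checking it first makes
# re-runs (backfills, or replays after an incident) no-ops rather than
# duplicate loads -- the essence of idempotent processing.
_loaded = set()

def run_partition(key, load):
    """Idempotent wrapper: each partition is loaded at most once per manifest."""
    if key in _loaded:
        return "skipped"
    with_retries(lambda: load(key))
    _loaded.add(key)
    return "loaded"
```

With this shape, re-running the job over a date range is a safe backfill: already-loaded partitions are skipped, and transient failures inside `load` are retried instead of failing the whole run.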

Candidate Profile 

Essential 

  • Strong experience designing, building and operating production-grade data pipelines and data platforms in Python. 
  • Strong hands-on experience with modern data orchestration, testing, deployment and monitoring practices in a production environment. 
  • Experience building solutions on AWS data services, including storage, processing and query technologies. 
  • Strong understanding of data modelling, data quality, schema design and performance optimisation across relational and analytical systems. 
  • Experience designing reliable data systems that recover gracefully from failure and operate effectively in real-world production conditions. 
  • Experience working with batch and streaming data pipelines, including operational support, troubleshooting and continuous improvement. 
  • Strong knowledge of security and access control principles for data platforms, including IAM, database permissions and secure handling of credentials and secrets. 
  • Experience providing technical leadership as a senior individual contributor through design reviews, code reviews, standards-setting and mentoring of others. 
  • Ability to work effectively with both technical and non-technical stakeholders, turning business needs into practical, scalable data solutions. 
  • Strong communication skills, with the ability to explain technical decisions clearly and influence outcomes across teams. 

Desirable 

  • Experience with Prefect or a similar workflow orchestration platform. 
  • Experience with streaming or data collection technologies. 
  • Experience with PostgreSQL, Redshift, ClickHouse or similar database and warehouse technologies. 
  • Experience with CI/CD tooling and Infrastructure as Code approaches. 
  • Experience building lightweight internal tools or data applications using Python frameworks such as Streamlit or Flask. 
  • Familiarity with dbt and working models that combine data engineering and analytics engineering. 
  • Understanding of operational best practices for cloud-based data platforms, including cost optimisation and observability. 
  • Experience working in a fast-moving product, technology or engineering-led environment. 
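The dbt familiarity mentioned above usually amounts to declarative column tests such as `not_null` and `unique` on models. A minimal standard-library sketch of those two checks, with a table represented as a list of dicts (purely illustrative, not dbt's own API):

```python
def not_null(rows, column):
    """Passes when no row has a NULL (None) value in the given column."""
    return all(row.get(column) is not None for row in rows)

def unique(rows, column):
    """Passes when every value in the given column occurs exactly once."""
    values = [row.get(column) for row in rows]
    return len(values) == len(set(values))
```

In dbt these checks are declared in YAML against a model's columns; expressing them as plain predicates like this is a common fallback when validating data quality inside a Python pipeline itself.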
HQ

Graphcore Headquarters Office

Wine Street, Bristol, United Kingdom, BS1 2PH


