
Kaluza

Data Engineer

In-Office
Bristol, England, GBR
Mid level

Job title: Data Engineer
Location: London / Bristol / Edinburgh (Hybrid)
Part time available: No
Salary: £46,400 - £58,000
Team: Data
Reporting To: Engineering Manager - Data Platform

This role is based in London, Bristol or Edinburgh and requires an existing right to work in the UK.
At this time, we are not able to offer visa sponsorship for this role. We are committed to building a diverse, global team and our sponsorship policy is evaluated on a role-by-role basis. We encourage you to keep an eye on our careers site to stay informed about future opportunities where we are able to offer visa sponsorship.

Kaluza is the Energy Intelligence Platform, turning energy complexity into seamless coordination. We help energy companies overcome today’s challenges while accelerating the shift to a clean, electrified future.

Our platform orchestrates millions of real-time decisions across homes, devices, markets and grids. By combining predictive algorithms with human-centred design, Kaluza makes clean energy dependable, affordable and adaptive to everyday life.

With teams across Europe, North America, Asia and Australia, and a joint venture with Mitsubishi Corporation in Japan, we power leading companies including OVO, AGL and ENGIE, as well as innovators like Volvo and Volkswagen.

What will I be doing?

As a Data Engineer, you will be responsible for setting up our Data Platform and delivering reliable, scalable data solutions that meet client requirements. You will work closely with stakeholders to understand business needs, translate them into technical data pipelines and architectures, and ensure data is accurate, accessible, and delivered on time.

Responsibilities include:

  • Pipeline Development: Drive the design and development of new data models and production-level pipelines to ingest, transform, and deliver data from multiple sources.
  • Infrastructure & CI/CD: Maintain data platform infrastructure via IaC tools and develop CI/CD workflows to support the deployment and optimisation of data workflows.
  • Data Quality & ETL: Implement and maintain ETL/ELT processes to transform raw data into structured, analytics-ready datasets while ensuring data quality and performance.
  • Collaboration: Partner with data analysts, data scientists, and engineering teams to support reporting, analytics, and machine learning use cases.
  • Monitoring & Support: Monitor pipeline performance and troubleshoot issues to ensure reliable and timely data delivery.
  • Architecture Improvement: Contribute to improving data platform architecture, scalability, and best practices.

About You
Ideally you’ll have/be:

  • Technical Experience: Previous experience as a Data Engineer or in a similar data-focused role, specifically working in data platform teams that maintain high-availability pipelines.
  • Cloud & Infrastructure: Hands-on experience maintaining cloud infrastructure (preferably AWS/GCP) through IaC tools, specifically Terraform.
  • DevOps Mindset: Experience maintaining CI/CD workflows, with a preference for GitHub Actions.
  • Data Design: Proficiency in designing and implementing data models, master data management, and data quality rules.
  • Collaborative Nature: You enjoy working across multiple teams to turn business requirements into actionable data solutions.
  • Modern Stack Proficiency: Experience with SQL, Python, and data orchestration tooling such as Dataform or dbt.

What will set you apart

  • Databricks Expertise: Previous experience working with Databricks is highly valued.
  • Advanced Techniques: Both theoretical knowledge and practical, hands-on experience with advanced analytics and analytics engineering techniques.
  • Platform Vision: The ability to contribute meaningfully to the evolution of data platform architecture and scalability standards.

Kaluza Values
Here at Kaluza we have five core values that guide us as a business: Play to win, Solve the real problem, Build trust every day, Own the outcome, Go further together.

Our Perks

  • Pension Scheme
  • Discretionary Bonus Scheme
  • Private Medical Insurance + Virtual GP
  • Life Assurance
  • Access to Furthr - a Climate Action app
  • Free Mortgage Advice and Eye Tests
  • Perks at Work - access to thousands of retail discounts
  • 5% Flex Fund - to spend on the benefits you want most
  • 26 days holiday
  • Flexible Bank Holidays - Giving you an additional 8 days which you can choose to take whenever you like
  • Progressive Leave Policy - With no qualifying service periods, including 26 weeks full pay if you have a new addition to your family
  • Personal Learning Budget - Upskill, learn and grow!
  • Home Office Budget
  • And more!

We want the best people

We’re keen to meet people from all walks of life — our view is that the more inclusive we are, the better our work will be. We want to build teams which represent a variety of experiences, perspectives and skills, and we recognise talent on the basis of merit and potential.

We understand some people may not apply for jobs unless they tick every box. But if you're excited about joining us and think you have some of what we're looking for, even if you're not 100% sure, we'd still love to hear from you.

Find out more about working at Kaluza on our careers page and LinkedIn.

You can also find our Applicant Data Protection Policy here.

Top Skills

AWS
Dataform
dbt
GCP
GitHub Actions
Python
SQL
Terraform


