Profitero+

Data Analytics Engineer

Posted Yesterday
Hybrid
Wokingham, Berkshire, England
Junior
Join the analytics team to develop data pipelines and analytical services for eCommerce insights using Snowflake and Python. Responsibilities include designing, building, and optimizing data solutions, and collaborating with BI and architecture teams.

About Profitero

Profitero is a leading global SaaS commerce platform that uses predictive intelligence to help brands anticipate, activate and automate their next best action to fuel profitable growth. Our technology monitors 80+ million products daily, across 1400+ retailers and 70+ countries, helping brands optimise search placement, product content, pricing, stock availability, reviews and more. News outlets, including Good Morning America, The Wall Street Journal and Ad Age, frequently cite and trust Profitero as a source of data for their stories. Now’s an exciting time to join our fast-growth business.


Profitero+ joined Publicis Groupe (a $13 billion global marketing services and technology company) as a standalone commerce division, infusing our business with significant product development resources and investment while giving our employees an incredible launchpad for their careers. Profitero’s tech and data combined with Publicis’ tech, data and activation services positions us to be a true end-to-end partner for helping brands maximise eCommerce market share and profits.


Come be a part of our fast-paced, entrepreneurial culture and next stage of growth.

Location: Winnersh Triangle, Reading


Overview

We are looking for a Data Engineer to join our Analytics team, which is building a next-generation eCommerce intelligence service for retailers and manufacturers. We welcome new team members who are not afraid of non-trivial tasks, can propose unconventional technical solutions, and take initiative.

About the role:

We are developing a new portfolio of analytical services that allows our customers to analyse eCommerce data. This involves collecting, processing, and presenting large volumes of data to the customer, and is built with Snowflake, Sigma, and Python. The data is currently hosted in Google BigQuery (GBQ), but we are migrating to Snowflake.

Profitero provides customers with data on how products perform online from various angles: prices, availability, placement, ratings and reviews, product content, etc. Customers can use dashboards, our web application, or an API connection to GBQ to get information about product performance online.

Migrating existing services to Snowflake requires rewriting certain data pipelines and adapting our client dashboards to the new technology stack. The data engineer on this team will be responsible for implementing those pipelines, working closely with the BI and Architecture teams.
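As a rough illustration (not Profitero's actual code), a transformation step in such a pipeline might aggregate raw availability observations into a per-product rate before loading into Snowflake. All field names below (`sku`, `retailer`, `in_stock`) are hypothetical; in practice the rows would come from a GBQ export or a Snowflake stage rather than an in-memory list.

```python
# A minimal sketch of one transformation step in an availability pipeline.
# Field names are hypothetical stand-ins for a real product feed.
from collections import defaultdict

def availability_by_sku(rows):
    """Aggregate raw availability observations into a per-SKU in-stock rate."""
    seen = defaultdict(lambda: [0, 0])  # sku -> [in_stock_count, total_count]
    for row in rows:
        counts = seen[row["sku"]]
        counts[1] += 1
        if row["in_stock"]:
            counts[0] += 1
    return {
        sku: round(in_stock / total, 3)
        for sku, (in_stock, total) in seen.items()
    }

sample = [
    {"sku": "A1", "retailer": "r1", "in_stock": True},
    {"sku": "A1", "retailer": "r2", "in_stock": False},
    {"sku": "B2", "retailer": "r1", "in_stock": True},
]
print(availability_by_sku(sample))  # {'A1': 0.5, 'B2': 1.0}
```

The same aggregation could equally be pushed down into SQL on either warehouse; keeping it as a pure function makes the step easy to unit-test during a migration.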

Responsibilities:

  • Design and build scalable and resilient Data & Analytics solutions
  • Automate data workflows and optimize data processing for performance and cost
  • Design and develop new data pipelines, and improve existing ones using data engineering best practices
  • Design and develop efficient, scalable data models that enable fast and accurate reporting while minimizing cost and query complexity
  • Monitor and optimize data warehouse costs, leveraging Snowflake's cost management tools to ensure efficient data processing and storage usage
  • Engage in proof of concepts and experiments
  • Participate in overall testing and production maintenance
  • Work closely with the BI Team and Architecture team to ensure smooth transition from GBQ to Snowflake
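The cost-monitoring responsibility above could, for example, involve summarising Snowflake's metering data. The sketch below assumes the `SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY` view as the source; in production the rows would arrive via the Snowflake connector, and the warehouse names and figures here are invented for illustration.

```python
# A hedged sketch of per-day, per-warehouse credit aggregation. Plain dicts
# stand in for rows queried from ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY.
from collections import defaultdict
from datetime import date, datetime

def credits_per_day(metering_rows):
    """Sum credits used per (day, warehouse) from metering rows."""
    totals = defaultdict(float)
    for row in metering_rows:
        key = (row["start_time"].date(), row["warehouse_name"])
        totals[key] += row["credits_used"]
    return dict(totals)

sample = [
    {"start_time": datetime(2024, 1, 1, 9), "warehouse_name": "ETL_WH", "credits_used": 1.2},
    {"start_time": datetime(2024, 1, 1, 10), "warehouse_name": "ETL_WH", "credits_used": 0.8},
    {"start_time": datetime(2024, 1, 2, 9), "warehouse_name": "BI_WH", "credits_used": 0.5},
]
print(credits_per_day(sample))
```

An aggregation like this can feed an alert or dashboard that flags warehouses whose daily credit spend drifts above an expected baseline.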

Who you are:

  • Bachelor's degree in Computer Science, Data Engineering, Data Science, Software Engineering or similar. A master’s degree is a plus.
  • 1+ years of related work experience

Technical skills:

  • Strong knowledge of Python and SQL, with at least one year of practical experience in data automation. Ability to write efficient and scalable code for data processing.
  • Hands-on experience with Snowflake, including performance tuning (cloud cost efficiency, data volume handling), cost management, and data sharing features. Knowledge of GBQ is a plus.
  • Solid understanding of designing and optimising data pipelines and data models in a cloud environment. Experience setting up ETL processes; knowledge of dbt is a plus.
  • Expertise in working with scalable data architectures, including data warehouses, data lakes, and data pipelines.
  • Experience with BI tools like Looker, Tableau, Sigma or similar is a plus.

Soft Skills:

  • Strong problem-solving skills with the ability to troubleshoot and resolve complex data issues.
  • Excellent communication and collaboration skills
  • Diligence and time management

We expect:

  • Ability to automate data workflows and optimize data processing for performance and cost.
  • Experience developing solutions in a Cloud environment
  • A natural interest in modern data processing technologies and the ability to learn new things
  • Drive to get into the development process quickly and deliver good-quality code

Nice to have:

  • Working experience in an Agile environment (Jira, Asana, Confluence)
  • Understanding of distributed architecture features and challenges

The above lists are not exhaustive, and the job holder is required to undertake such duties as may reasonably be requested within the scope of the post.

Why you want to work at Profitero:

We hire only the best and provide the compensation and benefit programs appropriate for proven top-performing professionals. We want our employees to have an opportunity to share in the financial success that results from our dedication to service excellence, high-quality deliverables and an unparalleled client experience.

Our package includes: competitive base salary; employee healthcare; life assurance; group income protection; dental care plan; eye care scheme; 24-hour online GP; company pension; cycle-to-work scheme; 25 days off plus bank holidays and your birthday off; reduced gym membership; social events; employee referral scheme; personal development plans; Profitero Hero scheme; and flexible working hours.

Profitero is committed to creating a diverse work environment and is proud to be an equal opportunity employer. All qualified applicants will receive fair consideration for employment. Profitero recruits, employs, trains, compensates and promotes regardless of race, religion, colour, national origin, sex, disability, age, veteran status, and other protected characteristics as required by applicable law.


Be part of a company at the forefront of the eCommerce revolution, where you will learn a lot and turbo-charge your skills, experience and career.

Top Skills

dbt
ETL
GBQ
Looker
Python
Sigma
Snowflake
SQL
Tableau


