Hello, let’s meet!
Who We Are
While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.
What We Do
We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data and AI solutions, and cutting-edge applications to shape the future of tech. Our clients include McLaren, Aviva, Deloitte, Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, InPost, and many, many more.
We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!
Beyond Projects
What makes Xebia special? Our community. We support tech communities, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets, covering both tech and soft skills. It’s not just a job. It’s a place to grow.
What sets us apart?
Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.
Your responsibilities:
- developing and enhancing real-time streaming pipelines using Apache Flink,
- migrating existing Flink jobs using the DataStream API and adapting them to newer platform standards,
- leading and executing the upgrade of the Flink platform to version 2.0,
- designing, optimizing, and maintaining high-throughput, fault-tolerant streaming architectures,
- migrating large-scale datasets from BigQuery (BQ) to Data Cloud Storage (DCS),
- scaling and automating ongoing data migration processes to support growing data volumes,
- converting datasets from Avro to Parquet format, with attention to performance, schema evolution, and storage optimization,
- leveraging AI-powered tools to accelerate migration, validation, and transformation workflows,
- ensuring data quality, integrity, and minimal downtime during migrations,
- collaborating with cross-functional teams and clearly communicating technical concepts to non-technical stakeholders.
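The data-quality point above (ensuring integrity with minimal downtime) is often handled by reconciling source and target after each migration batch. A minimal stdlib sketch of that idea, purely illustrative (`validate_migration` and `row_fingerprint` are hypothetical names, not the team's actual tooling):

```python
import hashlib

def row_fingerprint(row):
    """Stable fingerprint of one record (order-insensitive over columns)."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def validate_migration(source_rows, target_rows):
    """Compare row counts and per-row fingerprints between source and target.

    Note: sets collapse exact duplicate rows, so this is a sketch, not a
    production reconciliation tool.
    """
    src = {row_fingerprint(r) for r in source_rows}
    dst = {row_fingerprint(r) for r in target_rows}
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src - dst),
        "unexpected_in_target": sorted(dst - src),
    }
    ok = not report["missing_in_target"] and not report["unexpected_in_target"]
    return ok, report
```

In practice such checks run per partition or per day of data, so a failed batch can be re-migrated without touching the rest.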
Requirements:
- strong hands-on experience with Apache Flink, including development using the DataStream API,
- proven experience maintaining and upgrading Flink environments, ideally including exposure to Flink 2.0,
- deep understanding of streaming pipeline architecture, performance tuning, state management, and fault tolerance,
- experience migrating large-scale datasets from BigQuery (BQ) to Data Cloud Storage (DCS),
- strong proficiency in data format conversion, particularly Avro to Parquet,
- ability to design, scale, and automate migration workflows while ensuring data integrity and minimal service disruption,
- solid knowledge of Google Cloud Platform (GCP) and its data services,
- good understanding of distributed systems, schema evolution, and storage optimization strategies,
- ability to break down complex migration and platform challenges into clear, actionable steps,
- proactive mindset with strong ownership of solutions and risk identification,
- clear and effective communication skills, especially when explaining technical topics to non-technical stakeholders,
- understanding of how machine learning or intelligent automation can be applied to optimize and monitor data workflows,
- practical experience using AI-powered assistants (e.g. Claude Code, GitHub Copilot, Cursor) to improve productivity, quality, or decision-making in software delivery.
Work from the European Union region and a work permit are required.
Nice to have:
- experience working on high-scale, consumer-facing data platforms,
- background in long-running migration programs involving multiple data sources and formats,
- familiarity with observability, monitoring, and alerting for streaming systems,
- interest in and familiarity with emerging AI-driven practices (e.g. agent-based workflows, automation patterns, AI-augmented development), with a willingness to explore and experiment beyond standard approaches.
Recruitment Process:
CV review – HR call – Interview – Client Interview – Decision