N-iX

Junior Strong Data Engineer (Databricks)

Posted 12 Days Ago
Remote
Hiring Remotely in Poland
Junior

Join our team to work on enhancing a robust data pipeline that powers our SaaS product, ensuring seamless contextualization, validation, and ingestion of customer data. Collaborate with product teams to unlock new user experiences by leveraging data insights. Engage with domain experts to analyze real-world engineering data and build data quality solutions that inspire customer confidence. Additionally, identify opportunities to develop self-service tools that streamline data onboarding and make it more accessible for our users.

Our Client was established with the mission to fundamentally transform the execution of capital projects and operations. Designed by industry experts for industry experts, our Client's platform empowers users to digitally search, visualize, navigate, and collaborate on assets. Drawing on 30 years of software expertise and 180 years of industrial legacy as part of a renowned Scandinavian business group, the Client plays an active role in advancing the global energy transition. The company operates from Norway, the UK, and the U.S.

Key Responsibilities:

  • Design, build, and maintain data pipelines using Python.
  • Collaborate with an international team to develop scalable data solutions.
  • Conduct in-depth analysis and debugging of system bugs (Tier 2).
  • Develop and maintain smart documentation for process consistency, including the creation and refinement of checklists and workflows.
  • Set up and configure new tenants, collaborating closely with team members to ensure smooth onboarding.
  • Write integration tests to ensure the quality and reliability of data services.
  • Work with GitLab to manage code and collaborate with team members.
  • Utilize Databricks for data processing and management.
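To illustrate the kind of pipeline work described above, here is a minimal, hedged sketch of a data-validation stage in plain Python. All field names (`asset_id`, `timestamp`, `value`) are illustrative assumptions, not the client's actual schema; a production version would typically run as a PySpark job on Databricks.

```python
# Minimal sketch of a pipeline validation stage.
# Field names are hypothetical examples, not the client's real schema.

def validate_records(records, required_fields=("asset_id", "timestamp", "value")):
    """Split incoming records into valid rows and data-quality rejects.

    Each reject carries the original record plus the list of missing
    fields, so downstream tooling can report data-quality issues.
    """
    valid, rejects = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) is None]
        if missing:
            rejects.append({"record": rec, "missing": missing})
        else:
            valid.append(rec)
    return valid, rejects


# Example usage with two illustrative records:
valid, rejects = validate_records([
    {"asset_id": "P-101", "timestamp": "2024-01-01T00:00:00Z", "value": 4.2},
    {"asset_id": "P-102", "value": 7.0},  # missing timestamp
])
```

Separating valid rows from rejects (rather than dropping bad rows silently) is one common way to keep ingestion transparent and build the customer confidence the role mentions.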

Requirements:

  • Programming: Minimum of 1-2 years as a data/software engineer, or in a related field.
  • Python: Working coding experience in Python, particularly in delivering and maintaining data pipelines and troubleshooting code-level bugs; experience with large codebases, an IDE, and Git.
  • Data Skills: A structured approach to data insights and strong diagnostic skills for data-related issues.
  • Cloud: Familiarity with cloud platforms (preferably Azure).
  • Data Platforms: Knowledge of Databricks, Snowflake, or similar data platforms.
  • Database Skills: Knowledge of relational databases, and working experience with SQL.
  • Big Data: Experience using Apache Spark/PySpark is a plus.
  • Documentation: Experience in creating and maintaining structured documentation.
  • Testing: Proficiency with testing frameworks (pytest) to ensure code reliability and maintainability.
  • Version Control: Experience with Git and GitLab or an equivalent.
  • English Proficiency: B2 level or higher.
  • Interpersonal Skills: Strong collaboration abilities, willingness to learn new skills and tools, and an adaptive, exploratory mindset. We're looking for candidates who aren't afraid to reach out to others.
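As a small, hedged example of the pytest-style testing the requirements mention: the function and field names below are hypothetical, not taken from the client's codebase. A pytest test is just a plain function whose name starts with `test_` and whose body uses bare `assert` statements.

```python
# Hedged sketch of a pytest-style test for a hypothetical pipeline step.
# `transform` and its fields are illustrative assumptions only.

def transform(row):
    """Illustrative transform: lowercase the keys and cast `Value` to float."""
    return {k.lower(): v for k, v in row.items()} | {"value": float(row["Value"])}


def test_transform_normalizes_keys_and_casts_value():
    out = transform({"Asset_ID": "A-1", "Value": "3.5"})
    assert out["asset_id"] == "A-1"
    assert out["value"] == 3.5
```

Running `pytest` in the project directory discovers and executes such tests automatically; integration tests for data services follow the same shape, only exercising real inputs and outputs end to end.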

Nice to have:

  • Knowledge of Docker and Kubernetes.
  • Ability to travel abroad once or twice a year for on-site team building (Oslo, Norway).

We offer*:

  • Flexible working format: remote, office-based, or a mix of both
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

*not applicable for freelancers

Top Skills

Python, Databricks, Snowflake, SQL, Apache Spark, PySpark, Azure, pytest, Git, GitLab, Docker, Kubernetes
