
Blankfactor

Senior Data Engineer

Posted 6 Days Ago
Sofia, Sofia-grad
Senior level

What we do 

At Blankfactor, we are dedicated to engineering impact. We are passionate about creating value by building best-in-class tech solutions for companies looking to transform, innovate, and scale. In every project, we aim to deliver work that moves the needle and drives measurable outcomes for our partners and clients. Our full-stack development, data engineering, digital product, and enterprise AI solutions cater to a range of industries, including payments, banking, capital markets, and life sciences.

We are headquartered in Miami, Florida, have offices in Bulgaria, Colombia, and Romania, and are rapidly expanding our global footprint. Our culture of engineering excellence, technical expertise, and care for both our clients and our talented workforce has made us one of the fastest-growing companies in America.

We only hire the best and brightest. If you have talent and ambition, join us and be part of an environment that fosters innovation, collaboration, and growth. Welcome to Blankfactor!

What to expect in this role

As a Senior Data Engineer, you will play a key role in developing and optimizing data pipelines, ETL processes, and analytics solutions that handle large volumes of transaction data in real-time. You will collaborate closely with product, engineering, and compliance teams to ensure data reliability, accuracy, and security. This is an exciting opportunity to work on mission-critical systems in the financial industry and leverage your skills in big data, cloud infrastructure, and modern data engineering practices.

  • Design, build, and maintain scalable and efficient ETL/ELT pipelines to process high volumes of financial transaction data from various sources, including card networks, partner banks, and APIs.

  • Ensure seamless integration of structured and unstructured data from multiple sources (issuer, card networks, payment platforms) into centralized data warehouses and data lakes.

  • Implement real-time data streaming and processing systems to handle large-scale transactions and events using tools such as Kafka, Kinesis, or Spark (see the brief sketch after this list).

  • Manage and optimize data storage solutions (e.g., Snowflake, Redshift) to ensure efficient querying and reporting.

  • Implement robust data validation, error handling, and auditing mechanisms to ensure the accuracy and integrity of financial data. 

  • Collaborate with compliance and security teams to ensure data governance standards are met.

  • Implement monitoring, alerting, and logging for data pipelines and systems to ensure high availability and performance.

  • Ensure all data handling meets the stringent security and compliance standards (PCI DSS, GDPR, etc.) required in our business.
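
For illustration only, not a prescribed implementation: the sketch below shows the kind of real-time processing described above, using PySpark Structured Streaming to read transaction events from a Kafka topic, apply a basic validation filter, and land the results in a data lake path. The topic, schema, broker address, and S3 paths are hypothetical placeholders, not taken from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType,
)

spark = SparkSession.builder.appName("transaction-stream-sketch").getOrCreate()

# Assumed (hypothetical) schema for a card transaction event.
txn_schema = StructType([
    StructField("transaction_id", StringType()),
    StructField("card_network", StringType()),
    StructField("amount", DoubleType()),
    StructField("currency", StringType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a Kafka topic (placeholder broker and topic names).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "transactions")
    .load()
)

# Parse the JSON payload and keep only records that pass basic validation.
parsed = (
    raw.select(F.from_json(F.col("value").cast("string"), txn_schema).alias("txn"))
    .select("txn.*")
    .filter(F.col("transaction_id").isNotNull() & (F.col("amount") > 0))
)

# Land validated events in a data lake location, partitioned by ingestion date
# (placeholder S3 paths).
query = (
    parsed.withColumn("ingest_date", F.to_date("event_time"))
    .writeStream
    .format("parquet")
    .option("path", "s3://example-bucket/transactions/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/transactions/")
    .partitionBy("ingest_date")
    .outputMode("append")
    .start()
)

query.awaitTermination()

The same pattern maps onto the stack listed below, with Glue running the Spark job and MSK or Kinesis supplying the stream.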

Our stack:

  • Cloud platform – AWS

  • Execution engine – Glue (Spark)

  • Real-time solution – MSK/Kinesis

  • Data discovery/observability – dbt

  • Data orchestration – Apache Airflow (see the sketch after this list)

  • Data warehouse – TBC

  • IaC – Terraform/Terramate

  • CI/CD – Bitbucket Pipelines
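
To show how these pieces might fit together, here is a minimal, hypothetical Airflow DAG that triggers a pre-existing Glue (Spark) job and then runs a dbt build. It assumes Airflow 2.4+ with the Amazon provider installed; every name, path, and schedule is a placeholder rather than anything taken from this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="daily_transactions_pipeline",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                      # use schedule_interval on older 2.x versions
    catchup=False,
) as dag:
    # Run a pre-existing Glue (Spark) job that ingests raw transaction files.
    ingest = GlueJobOperator(
        task_id="ingest_transactions",
        job_name="ingest-transactions",     # placeholder Glue job name
        region_name="eu-central-1",         # placeholder region
    )

    # Build warehouse models with dbt once ingestion has finished.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/transactions",  # placeholder path
    )

    ingest >> transform

Provisioning the Glue job and deploying the DAG would typically sit with Terraform/Terramate and Bitbucket Pipelines, which are omitted here.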

Requirements and technical skills

  • 5+ years of proven experience as a Senior Data Engineer, preferably in the fintech or payments industry (though domain experience is not a deal-breaker).

  • Advanced knowledge of Python, SQL, and NoSQL.

  • Experience with modern data warehousing technologies (Snowflake, Redshift, or similar).

  • A solid understanding of, and preferably hands-on experience with, real-time and event-driven systems such as Kafka, Kinesis, or similar.

  • Strong experience with ETL/ELT pipeline development using tools such as AWS Glue, Apache Airflow, and dbt.

  • Working knowledge of Terraform for Infrastructure as Code.

  • Experience with distributed computing and big data technologies like Spark.

  • Experience with cloud platforms such as AWS, GCP, or Azure (preferably AWS) and their services.

  • Working knowledge of containerization (Docker).

  • Strong communication skills to collaborate with cross-functional teams and translate business needs into technical solutions.

  • Ability to troubleshoot and resolve complex data issues in high-pressure environments.

  • Ability to work with large data sets and support the development of insights that drive business decisions.

  • Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, Engineering, or a related field.

What you can expect as a member of the Blankfactor team

  • Fintech Expertise: Access to expertise in machine learning, data science, big data, and AI, providing opportunities for continuous learning and exposure to cutting-edge technologies.

  • Technology exams and certifications covered by the company

  • World-class workspace for unleashing creativity

  • Lunch is provided when working from the office 

  • Fresh fruits and snacks in the office 

  • Diverse client portfolio

  • Cutting-edge high-tech stack 

  • Monthly on-site gatherings

  • Annual festivities: Participate in team-building activities, family BBQs, and end-of-year celebrations

  • Participation in sporting challenges and marathons

  • Voluntary social events 

We believe that diversity of experience and background contributes to more robust ideas and a stronger team. All qualified applicants will receive consideration for employment without regard to religion, race, sex, sexual orientation, gender identity, national origin, or disability.

Top Skills

Apache Airflow
AWS
Dbt
Docker
Glue
Kafka
Kinesis
NoSQL
Python
Redshift
Snowflake
Spark
SQL
Terraform
