Data Engineer
Warsaw, remote
B2B
Haptiq is a global technology company dedicated to delivering best-in-class software and digital solutions engineered to drive profitable and scalable growth for our clients.

Operating as the nexus of portfolio management, growth, and optimization, Haptiq offers SaaS platforms fueled by cutting-edge AI, a comprehensive end-to-end data management platform, and a suite of strategic services.

Among our flagship offerings is Olympus FinTech, a dynamic solution empowering private equity, debt, and CLO managers with customizable and streamlined workflows, robust data management, and sophisticated reporting capabilities.

Pantheon by Haptiq is a platform for creating interactive, personalized virtual experiences. It uses AI to make content more engaging and customizable, which makes it useful for entertainment, marketing, and education. Users can easily build and manage these immersive experiences in real time.

About the role

We are seeking a motivated and self-driven Data Engineer to join our dynamic data team. You will play a key role in designing, building, and maintaining ETL infrastructure and data pipelines, ensuring data quality and scalability.

Responsibilities:

  • Develop and optimize ETL pipelines for efficient data ingestion, transformation, and loading.
  • Design, build, and deploy Python scripts and ETL processes using Azure Data Factory (ADF).
  • Work with structured, semi-structured, and unstructured data across diverse sources.
  • Implement dimensional data modeling and data warehousing concepts (OLTP, OLAP, Facts, Dimensions).
  • Ensure best practices in data management, security, and governance within cloud environments.
  • Troubleshoot and optimize ADF jobs and ETL workflows for performance and scalability.
  • Perform code reviews, manage version control (GitHub), and deploy via CI/CD pipelines.
  • Collaborate on cloud-based architectures and enterprise-wide data migrations.

Requirements:

  • 3+ years of Python development.
  • 5+ years of SQL Server development and experience working with large datasets.
  • Proficiency in developing and deploying ETL pipelines using Databricks and PySpark.
  • Expertise in cloud data warehouses and data platforms such as Synapse, Redshift, Snowflake, or ADF.
  • Knowledge of event-based/streaming data ingestion and processing.
  • Strong skills in SQL, Python, data modeling, and dimensional design.
  • Hands-on experience with cloud architectures and messaging systems.
  • Familiarity with CI/CD workflows and deployment processes.
  • Cloud certifications are a plus.
  • Experience with Airflow, AWS Lambda, Glue, and Step Functions will be an advantage.

Benefits

  • A highly professional team with an informal and friendly atmosphere
  • Option to work from our comfortable office in Warsaw at Prosta str. 51
  • Paid vacation (20 business days per year) and 100% paid sick leave
  • 5 sick days per year
  • Equipment provision
  • Medical insurance (after the end of the probationary period)
  • Partially compensated educational costs (for courses, certifications, professional events, etc.)
  • Legal and accounting support in Poland
  • English and Polish classes 2 times a week (online)
  • Vibrant corporate life: corporate parties and gifts for employees on significant dates
Join Haptiq
To work from Poland, you must hold a PBH visa, a Karta pobytu (residence permit), or a Polish passport to be considered for this position. Thank you!
Tatyana Kyashkina
Recruiter
tatsiana.kiashkina@onthespotdev.com