Data Engineer (Data Lakehouse & Infrastructure)
We are Up Coop, a well-established leader in prepaid corporate solutions. With 60+ years of experience in 25 countries across 4 continents, we serve 23 million users from 1 million corporate customers, whose products and services are accepted at 800,000 retail outlets.
Tombou Bulgaria Ltd. is a wholly owned subsidiary of Up Coop and a trusted market leader in employee benefits solutions in Bulgaria. For years, we have been supporting companies across the country by helping them motivate, engage, and retain their people through innovative and reliable benefit solutions.
Our success is driven by our team. We believe that when employees feel valued, supported, and empowered, they perform at their best. That’s why we invest not only in our products, but also in our people and workplace culture.
As the market evolves, so do we. Tombou Bulgaria is actively shaping the digital future of employee benefits in Bulgaria, with a strong focus on digital vouchers, electronic solutions, and modern platforms that make everyday work life easier for employers and employees alike.
About the Role:
We are currently building our next-generation data platform using a cutting-edge Data Lakehouse architecture. Our stack will be built on Apache Iceberg, Apache Spark, and Trino.
We are looking for a skilled Data Engineer to take ownership of this ecosystem. After the initial setup by our external consultants, you will become the primary expert responsible for maintaining, scaling, and evolving our data infrastructure to meet the growing needs of our business.
Key Responsibilities:
- Platform Ownership: Take over and manage the production environment involving Apache Iceberg, Spark, and Trino.
- Performance Tuning: Optimize query performance in Trino and manage Iceberg table maintenance (compaction, snapshot management, and partitioning strategies).
- Pipeline Development: Design, build, and scale robust ETL/ELT pipelines using Apache Spark.
- Architecture Evolution: Lead the transition from initial deployment to a fully matured data platform, implementing new features and integrations.
- Data Governance & Security: Ensure data integrity, schema evolution management, and access control across the Lakehouse.
- Collaboration: Act as the bridge between business stakeholders and the data infrastructure, ensuring analysts can run high-performance SQL queries.
Requirements:
- Big Data Expertise: Proven experience with Apache Spark for large-scale data processing.
- Advanced SQL: Exceptional SQL skills and experience with distributed SQL query engines (specifically Trino).
- Modern Table Formats: Hands-on experience or deep theoretical understanding of Apache Iceberg (or similar formats like Delta Lake/Hudi).
- Orchestration: Experience with workflow management tools like Apache Airflow.
- Data Modeling: Experience with dimensional modeling (Star Schema) for BI data sources.
Nice to Have:
- Linux administration experience
- Experience administering any RDBMS (Relational Database Management System)
- Experience with dbt, Nessie (for data versioning), or Terraform
- Experience with Tableau and/or Power BI
What We Offer:
- The opportunity to work with one of the most advanced data stacks in the industry.
- A high degree of autonomy and the power to shape the company’s data strategy.
- A collaborative environment where technical excellence is valued.
- Competitive benefits package:
- ✓ Food vouchers
- ✓ Birthday voucher
- ✓ Anniversary voucher
- ✓ Additional health insurance
- ✓ 25 days paid annual leave
- ✓ Financial assistance for childbirth
- ✓ Referral bonus
- ✓ Fruit days in the office
- Flexible home office policy
- Regular teambuilding activities and company CSR events
- Additional training and certifications on demand
- An excellent office location in the heart of the city, easily reachable by public transport.
Is this a good match for you? Apply now by sending your CV in English. We would be happy to get to know you.
Your personal data will be handled confidentially and processed only for recruitment purposes in line with GDPR requirements. Only short-listed candidates will be invited for an interview.