(Senior) Developer - Data Engineering (all genders)
Holidu is one of the fastest-growing travel tech companies worldwide. Our mission is to make the booking and hosting of holiday homes free of doubt and full of joy. We help hosts to manage their vacation rental with ease and guests to find the perfect holiday home they truly enjoy. Our team of more than 600 colleagues from 60+ nations shares a passion for modern technologies, an ambition for constant improvement and the drive to bring the best experience to more than 40k hosts and 100 million website users each year. You want to achieve amazing milestones with us? Then pack your bag, hop on board and get ready for takeoff!
Your future team
Our Tech Stack
- We use Airflow as our scheduler and dbt (dbt-core) to craft efficient data pipelines, with an extensively customized setup that leverages the flexibility of Python (a minimal sketch of this pattern follows this list).
- We query and process data across Redshift, Athena, and DuckDB.
- Our Redshift setup is decentralized via data sharing, which improves flexibility and scalability.
- Everyday data-warehousing tasks are automated with a Spring Boot CLI, streamlining operations.
- We make extensive use of AI (LLM) tools to boost productivity: Claude, Copilot, Codex, MCP servers, agentic systems, or a tool of your choice.
- Our tech arsenal includes Terraform, Docker, and Jenkins, forming a robust foundation for cloud-based development and seamless DevOps integration.
- Monitoring is a top priority, and we employ ELK, Grafana, Looker, OpsGenie, and internally developed technologies to ensure real-time visibility into system performance and data workflows.
- Build and maintain Java microservices that collect user tracking data from our websites and mobile applications via REST APIs, using Kafka as the central message broker (an illustrative event-producer sketch follows this list).
- Deploy applications on AWS EKS (Kubernetes), ensuring high availability, scalability, and efficient resource management.
- Develop Java/Kotlin SDKs that enable backend development teams to send user event data to Kafka and AWS Firehose seamlessly.
- Architect event-driven data collection systems that process millions of user interactions daily across our digital properties, with sub-second latency requirements.
- Configure and optimize third-party ingestion tools (Airbyte, Fivetran) to extract data from external APIs, databases, and SaaS platforms.
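
To give a concrete flavor of the Airflow + dbt setup described above, here is a minimal sketch of a daily DAG that triggers dbt-core runs and tests via the BashOperator. The DAG id, project/profile paths, and model selector are illustrative assumptions, not our actual configuration.

```python
# Minimal sketch of an Airflow DAG driving dbt-core runs (assumes Airflow 2.4+).
# The DAG id, project/profile paths, and model selector are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_example",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["dbt", "data-platform"],
) as dag:
    # Build the mart models against the warehouse profile.
    dbt_run = BashOperator(
        task_id="dbt_run_marts",
        bash_command=(
            "dbt run --project-dir /opt/dbt/project "
            "--profiles-dir /opt/dbt/profiles --select marts"
        ),
    )

    # Run dbt tests so data-quality failures surface as failed Airflow tasks.
    dbt_test = BashOperator(
        task_id="dbt_test_marts",
        bash_command=(
            "dbt test --project-dir /opt/dbt/project "
            "--profiles-dir /opt/dbt/profiles --select marts"
        ),
    )

    dbt_run >> dbt_test
```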
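And as a rough illustration of the event-collection flow (the production services and SDKs are written in Java/Kotlin, as noted above), the following Python sketch shows the basic pattern of publishing a tracking event to Kafka with confluent-kafka. The topic name, broker address, and event fields are assumptions for illustration only.

```python
# Illustrative sketch of publishing a user-tracking event to Kafka.
# Topic name, broker address, and event schema are hypothetical; the real
# collection services and SDKs described above are implemented in Java/Kotlin.
import json
import time

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker


def delivery_report(err, msg):
    # Called once per message to confirm delivery or report an error.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")


event = {
    "event_type": "search_performed",   # hypothetical event type
    "user_id": "anon-123",
    "timestamp": int(time.time() * 1000),
    "properties": {"destination": "Mallorca", "guests": 4},
}

# Key by user id so events from the same user land on the same partition.
producer.produce(
    "user-tracking-events",             # hypothetical topic
    key=event["user_id"],
    value=json.dumps(event),
    callback=delivery_report,
)
producer.flush()
```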
Your role in this journey
- Play a key role in a high-performing cross-functional team with a strong focus on data products, velocity, quality and engineering culture.
- Engage comprehensively in software development - from ideation to release and monitoring.
- Architect, design, and optimize robust data pipelines to extract, transform, and load diverse datasets efficiently and reliably.
- Build practical development toolchains that empower other teams in the organization to create high-quality datasets.
- Shape engineering objectives in tandem with engineering managers.
- Passionate about the latest data technologies? You'll research new solutions, optimize our existing stack, and pioneer innovations that push our platform forward.
- Seek cost-efficient solutions while maintaining high standards of quality (SLA).
- Proactively drive initiatives to improve data engineering processes, monitor pipeline performance, and identify opportunities for optimization and efficiency gains.
- Focus on the growth of your team by offering consistent and valuable feedback and recruiting new engineers.
- Apply best practices and continuously learn and share your thoughts with your peers.
- Support Data Analysts in migrating from the Airflow PostgresOperator (Redshift) to dbt on Athena (see the migration sketch after this list).
- Constantly evaluate new implementations with cost, productivity, and velocity in mind.
- Challenge the status quo, proposing solutions that will have company-wide impact.
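
As a rough sketch of the PostgresOperator-to-dbt migration mentioned above: a transformation that used to live as inline SQL inside an Airflow task targeting Redshift can instead become a version-controlled dbt model that runs against Athena, with Airflow only orchestrating the dbt command. The connection ids, task names, table names, paths, and model selector below are hypothetical.

```python
# Sketch of the migration pattern (assumes Airflow 2.4+ with the Postgres
# provider installed). All ids, table names, and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="daily_bookings_migration_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Before: transformation logic embedded in the DAG as raw SQL on Redshift.
    load_bookings_legacy = PostgresOperator(
        task_id="load_bookings_legacy",
        postgres_conn_id="redshift_default",  # hypothetical connection id
        sql="""
            INSERT INTO analytics.daily_bookings
            SELECT booking_date, count(*) AS bookings
            FROM raw.bookings
            GROUP BY booking_date;
        """,
    )

    # After: the same logic lives in a dbt model (e.g. a daily_bookings model)
    # whose target profile points at Athena; Airflow just invokes dbt.
    run_daily_bookings_dbt = BashOperator(
        task_id="run_daily_bookings_dbt",
        bash_command=(
            "dbt run --project-dir /opt/dbt/project "
            "--profiles-dir /opt/dbt/profiles --select daily_bookings"
        ),
    )
```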
Your backpack is filled with
- A bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
- Experience building and implementing Lakehouse architectures in AWS or other similar setups.
- The ability to effectively build batch and streaming data pipelines using technologies such as Airflow, dbt, Redshift, Athena/Presto, Firehose, Spark, and SQL databases.
- Strong knowledge of how distributed systems work.
- DataOps knowledge (e.g., Infrastructure as Code, CI/CD for data pipelines, automated testing, monitoring/observability implementation, data quality management methodologies).
- Desire to learn and use cutting-edge LLM tools and agents to improve your and the entire team's productivity.
- Strong programming skills in Python, SQL, or similar languages.
- Experience with containerization and orchestration technologies (Docker, Kubernetes/EKS).
- Previous experience in Data Governance is highly desirable.
- Excellent communication skills with the ability to influence stakeholders and align technical solutions with business objectives.
Our adventure includes
- Impact: Make a difference for hundreds of thousands of monthly users.
- Growth: Take responsibility from day one and develop through regular feedback, workshops, and knowledge exchanges.
- Personal Development: Use your learning budget and 2 extra study days for conferences, books, courses, and more.
- Community: Engage with international, diverse, yet like-minded colleagues through regular events and 2 office days per week with your team.
- Flexibility: Benefit from our hybrid work policy and the chance to work from other local offices for up to 8 weeks a year.
- Fitness: Get a premium gym membership at a discounted rate.
- Travel: Enjoy 28 vacation days + 13 public holidays in Bavaria, the possibility to take up to 10 unpaid vacation days, and special discounts on our Holidu Homes properties.
Want to travel with us?
Get in touch!
Do you have questions about a role or want to learn more about us? Send us an e-mail and we will get back to you soon!