
Your Role

Data is a huge part of what we offer our customers: real-time stock market data streamed directly from the stock exchange, historical data sets aggregated on different time frames, and much more. The core driver of our infrastructure is therefore Apache Kafka, together with scalable connectors from the Kafka ecosystem, serving our customers all the data they need.

We typically process 25,000+ messages per second with Kafka on a normal day, so building a reliable, scalable, low-latency infrastructure that can handle peaks is core to achieving our mission.

As described, we not only stream the data but also want to aggregate and store it. We therefore combine multiple connectors with the core pipeline, e.g. to persist the entire data stream in scalable databases or to push data to our customers over websockets.

You will also help us build, maintain, and improve a data bus for internal event-driven services. For instance, when you wire money, an event is triggered, sent through our Kafka data bus, and received on the other end by another service that processes it. As you might have guessed, handling customers' money demands extra care on the infrastructure side, so that will also be one of your daily challenges.
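As an illustration of the publish/subscribe pattern behind such a data bus (a minimal in-memory sketch, not our actual Kafka setup; the topic name and event fields are hypothetical):

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable, DefaultDict, List


@dataclass
class Event:
    topic: str
    payload: dict


class EventBus:
    """Minimal in-memory stand-in for a Kafka-style data bus."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[Event], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Event], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, event: Event) -> None:
        # Fan the event out to every service subscribed to its topic.
        for handler in self._subscribers[event.topic]:
            handler(event)


# A downstream "service" that processes wire-transfer events.
processed = []
bus = EventBus()
bus.subscribe("transfers", lambda e: processed.append(e.payload["amount"]))

# A wire transfer triggers an event on the bus; the subscriber receives it.
bus.publish(Event(topic="transfers", payload={"amount": 100}))
print(processed)  # → [100]
```

In production, the in-memory dictionary is replaced by Kafka topics and consumer groups, which add durability, ordering, and replay on top of the same decoupled producer/consumer idea.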

You will always work closely with our backend and AWS infrastructure teams, as you build the important bridge between our customers and partners and maintain our most critical infrastructure.

Your Profile

—> We are more than happy to be proven wrong. Simply apply if the job sounds good to you, and we will have an honest conversation about you and the way you think about infrastructure engineering.

Our Offer