The Future of Real-time Data - Lunch & Learn with Confluent and SoftwareMill
As more and more institutions and enterprises shift toward streaming-first platforms, the challenges of designing scalable, fault-tolerant, and low-latency systems are more relevant than ever. That's why we've once again joined forces with Confluent, this time also partnering with PKO Bank Polski, to host another Lunch & Learn event and discuss the future of real-time data and its impact on financial innovation.

At the wonderful PKO BP venue in the center of Warsaw, we gathered experts and practitioners working with event-driven architectures, real-time analytics, and modern data platforms, from both Confluent and SoftwareMill, and organized a learning session featuring real-life examples of using Confluent's powerful data streaming tools.

A Deep Dive into Streaming Infrastructure

We packed the agenda with first-hand experience and live insights.

The event kicked off with a short but insightful introduction by our CTO, Michał Matłoka, and Marco Marescia from Confluent, outlining the strategic and technical foundation of our partnership, rooted in helping organizations across industries leverage Kafka-powered solutions to build faster, more resilient data platforms.

Confluent's Ruben Catarrunas started the learning part by providing an overview of the Confluent Data Streaming Platform's latest capabilities. He highlighted enhanced governance and security features designed for large-scale deployments. To provide more context and value, he shared interesting real-life examples of how these features have been implemented in financial institutions.

Breaking the Scalability Bottleneck

Later, we explored a case study on solving performance issues in a high-capacity, enterprise-grade system. Grzegorz Kocur and Krzysztof Ciesielski, our Principal Engineers, explained how they analyzed the architecture and walked through a step-by-step migration to a new design that is not only free of bottlenecks but also better suited to handling multiple types of tenants.

Kafka is often a great choice for handling large-scale message traffic in multi-tenant systems. It's difficult to design such an architecture optimally up front; there are always some residual unknown unknowns, and overlooked details may surface only once a cluster reaches a very large size and traffic volume. For example, a consumer group depends on a single coordinator, which can become a performance bottleneck under heavy load.

Fortunately, Kafka is exceptionally flexible, especially when it comes to consumption patterns and re-reading messages. This offers many opportunities to optimize the handling of different tenant groups or special tenants while their data continues to flow in production in real time, even on very large clusters.

Krzysztof Ciesielski, Principal Software Engineer & Head of Security, SoftwareMill
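The coordinator-driven work Krzysztof alludes to can be pictured with a simplified sketch. The snippet below is plain Python, not the real Kafka client or assignor code; the consumer names and counts are illustrative. It mimics the range-style strategy a group coordinator applies when splitting one topic's partitions among the members of a consumer group.

```python
# Simplified sketch of a range-style partition assignment, similar in
# spirit to what a Kafka group coordinator computes during a rebalance.
# Illustrative only: it does not use the real Kafka client.

def range_assign(num_partitions: int, consumers: list[str]) -> dict[str, list[int]]:
    """Give each consumer a contiguous chunk of partitions.

    The first (num_partitions % len(consumers)) consumers receive one
    extra partition, mirroring the behavior of Kafka's RangeAssignor.
    """
    members = sorted(consumers)  # sort member ids for a deterministic result
    base, extra = divmod(num_partitions, len(members))
    assignment: dict[str, list[int]] = {}
    start = 0
    for i, member in enumerate(members):
        count = base + (1 if i < extra else 0)
        assignment[member] = list(range(start, start + count))
        start += count
    return assignment

# Example: 7 partitions of one topic spread across 3 consumers.
print(range_assign(7, ["c1", "c2", "c3"]))
# {'c1': [0, 1, 2], 'c2': [3, 4], 'c3': [5, 6]}
```

Because every membership change forces the coordinator to recompute an assignment like this for the whole group, very large or churn-heavy groups pay for each rebalance. One way to sidestep the group protocol entirely is manual partition assignment (`assign()` instead of `subscribe()` in the Kafka consumer API), which is one of the flexible consumption patterns mentioned above.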

Last week marked our third opportunity to co-host a Lunch & Learn event in Warsaw with Confluent. This time, we partnered with PKO Bank Polski and had the pleasure of hosting the event at their fantastic venue in the heart of the city, complete with a great view.

These events are more than just a chance to hear insightful presentations - they're also a valuable opportunity to connect with customers and engage in conversations with other companies about their Kafka-related challenges.

We enjoyed two excellent presentations: one focused on the Confluent Platform and another showcasing a technical use case. Later in the evening, we also co-hosted the Apache Kafka Meetup Warsaw at the Snowflake office, featuring talks from Snowflake, Confluent, and SoftwareMill.

We're looking forward to future editions and deepening our collaboration with Confluent even further.

Michał Matłoka, CTO, SoftwareMill

Real-Time Data as a Driver of Financial Innovation

From fraud prevention and transaction monitoring to dynamic pricing and personalized services, real-time data processing is becoming the backbone of modern banking. With streaming platforms like Apache Kafka and Confluent, financial institutions are unlocking new capabilities to act on data in motion rather than hours or days later. This evolution enables not only faster decisions but also richer customer experiences and smarter automation.

As a Confluent Premium Partner, we help customers bring these capabilities into production-grade systems, with security, compliance, and scalability requirements in mind.

Interested in harnessing the power of Kafka for your business? We’re here to help! Let's talk.
