Meet the Data Streaming Experts: Lunch & Learn Meetup in Warsaw
In today's fast-paced digital landscape, the ability to obtain, process, and manage data in real time has become a critical competitive advantage. Businesses of all sizes increasingly rely on data streaming to gain insights, enhance customer experiences, and make data-driven decisions. At the core of this data revolution are tools like Apache Kafka, a powerful, open-source platform that has transformed how organizations handle and process data streams.
We explored a few answers to data streaming challenges at the recent Confluent Lunch & Learn event in Warsaw. The main goal of our meeting was to show how businesses can leverage the data-in-motion concept, eliminate the operational burden of self-managed Apache Kafka, and migrate to the cloud-native Confluent platform.
Our guests had the opportunity to witness firsthand the benefits of this transition. The event's agenda included:
- A Data Mesh built on Event Streams presented by Confluent's Marcel Pellicero
- Schema Management as a Service with Confluent Cloud by Krzysztof Atłasik, SoftwareMill Senior Software Engineer
- Live Demo: Bringing Data in Motion with a fully managed Service for Apache Kafka in a hybrid world, again by Marcel Pellicero.
Overall, the event served as a compelling demonstration of how Confluent's cloud-native platform facilitates data management, aligning it with the real-time imperatives of today's business landscape. Possibilities such as enhanced scalability and cost reduction, along with a deeper understanding of data streaming, sparked fruitful discussions among participants.
Want to know more about Apache Kafka and its use cases? Read this article.
The role of Schema Management in Data Streaming
As a Confluent Premium Partner, we focus on adapting the Apache Kafka platform to the individual needs of the client and eliminating their pain points. That's why, this time, we chose to discuss one of the beneficial solutions we rely on - Schema Management.
During his presentation, Krzysztof Atłasik focused on the concept of Schema Management as a Service, which revolves around Schema Registry. It’s like a data organizer for Kafka users - a repository for managing and validating schemas for topic messages. It helps keep data in check, ensuring its quality and adherence to guidelines, makes collaboration easier, and enhances system performance. What’s more, with Confluent Cloud, you get managed Schema Registry out of the box. It helps avoid many potential configuration errors and improves operational performance.
Proper schema management is imperative in microservice-oriented architecture. On the other hand, handling a myriad of various schemas can be a daunting task. Schema Registry automates registering and evolving schemas, as well as distributing correct schema versions between services. It enables organizations to enforce data consistency and improves data governance.
~ Krzysztof Atłasik, Senior Software Engineer
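To make the "evolving schemas" idea above more concrete, here is a minimal, self-contained sketch (in Python, with hypothetical names) of the default backward-compatibility rule that Schema Registry applies when a new schema version is registered: consumers using the new schema must still be able to read records written with the previous one, so any field added in the new version needs a default value. This is an illustration of the rule only, not the Schema Registry API itself, and it covers just the added-field case of Avro's full resolution rules.

```python
def is_backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    """Illustrative check: every field that is new in new_schema
    must carry a default, or consumers on the new schema cannot
    decode records written with old_schema."""
    old_fields = {f["name"] for f in old_schema["fields"]}
    for field in new_schema["fields"]:
        if field["name"] not in old_fields and "default" not in field:
            return False
    return True

# Version 1 of a User record schema (Avro-style, as plain dicts).
v1 = {
    "type": "record", "name": "User",
    "fields": [{"name": "id", "type": "long"}],
}

# Adding an optional field with a default keeps old records readable.
v2_ok = {
    "type": "record", "name": "User",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "email", "type": ["null", "string"], "default": None},
    ],
}

# Adding a required field without a default breaks backward compatibility.
v2_bad = {
    "type": "record", "name": "User",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "email", "type": "string"},
    ],
}

print(is_backward_compatible(v1, v2_ok))   # True
print(is_backward_compatible(v1, v2_bad))  # False
```

In Confluent Cloud, the managed Schema Registry performs this kind of check automatically on every schema registration, rejecting incompatible versions before they can reach producers.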
Read more: How to run Apache Kafka in the cloud?
Knowledge sharing and networking - a few more insights
It's not the first time that Confluent and our team have invited you to discussions with Apache Kafka experts. In the previous edition, our CTO Adam Warski gave a technical talk about incorporating Kafka into software architecture, overcoming challenges related to scalability, and handling a growing number of WebSocket clients. Adam also shared his impressions about this year's meeting:
A very pleasant surprise at the Kafka Lunch & Learn event was the turnout - almost everybody showed up, and we had to look for additional chairs! I think this shows the potential of the Polish market in terms of Kafka and Confluent Cloud deployments. We’ve had a couple of talks, both from a guest speaker from Confluent and our own Krzysztof Atłasik, who demoed how schemas can be managed and migrated (which is usually the hardest part!). But the most valuable part, as with most conferences, turned out to be the networking during the coffee breaks and lunch. Hopefully, that’s the start of a community which will meet more frequently!
~ Adam Warski, CTO SoftwareMill
As our developers know Kafka's ins and outs very well, and the demand for reliable data management tools is growing, we feel that education in this field is really important.
We would like to thank both the participants and our partners from Confluent for such a successful event. We hope that this will soon develop into further successful Kafka&Confluent implementations in our customers' environments!
~ Marcin Głasek, Business Developer
We hope that all participants spent valuable time at our meeting. We are glad that after the presentations, there were a lot of exciting questions and worthwhile discussions! As always, it was also a time to talk about the industry and its current problems and challenges, so it was inspiring on many levels.
If you are interested in implementing Kafka and think it is the solution for your business or want to make sure it is, do not hesitate to contact us! We're here to assist you in making the most of Confluent's methods and tools.
For now, see you next year!