Course: Apache Kafka® for .NET Developers

What is Event Streaming?

3 min
Wade Waldron

Staff Software Practice Lead


Overview

Modern applications need to respond to events triggered by users as they happen. Waiting minutes or even hours for a batch job is no longer acceptable. But we can't necessarily respond synchronously because that creates contention and bottlenecks that can slow our system to a halt. Instead, we rely on event streaming to allow near real-time, asynchronous processing of events. These systems are built using tools such as Apache Kafka. In this video, we'll discuss how we can use event streaming to implement an asynchronous communication strategy.

Topics:

  • Synchronous Communication
  • Asynchronous Communication
  • Events
  • Event Streaming

Resources

Use the promo code DOTNETKAFKA101 to get $25 of free Confluent Cloud usage


What is Event Streaming?

Hi, I'm Wade from Confluent. In this video, we're going to introduce event-driven systems. When a user interacts with an application, there is an expectation that it will respond in real time. However, the reality is often more complex than it seems. Some operations may need to respond immediately, but many will benefit from taking just a little bit longer.

Consider the example of a fitness tracker. Wrist-worn trackers will often record data, such as heart rates and step counts, and then periodically push that data to the user's mobile device. The mobile device will then send the data to a remote server somewhere. From there, we calculate a variety of metrics that can be exposed to the user in their mobile application.

We could do all of this in real time using synchronous communication. When the mobile device contacts the remote server, we could compute all of the necessary metrics while the device is connected, and then send those results back to the device. However, those calculations might take a long time. In the meantime, the mobile device is left on the hook, waiting for us to finish. This means we force the user to keep using their mobile data plan, and we have no guarantees about the quality or stability of that connection. What happens if the user goes through a tunnel or enters some other kind of dead zone while they are connected? At that point, we'll lose them, and we won't be able to respond. The device won't know that we've finished processing its request, and it will need to try again, which just wastes resources.

However, the reality is that those updates don't need to happen immediately. It would be perfectly acceptable for us to push the data to a server and perform the necessary computations later. This minimizes the amount of time the mobile device needs to be connected and makes our system a little more robust. We do this by moving the computation to an asynchronous, event-driven approach. In this case, when the device connects to the remote server, it pushes the data to the server in a raw form. The server records the raw data as an event and sends an acknowledgement back to the device. Think of an event as an object that represents something that happened in the past. Once we have recorded the data, we can asynchronously process the event. Then, the next time the mobile device connects, we can send it the results.

These events are usually pushed to a messaging platform, such as Kafka. They're recorded in a Kafka topic, and downstream applications subscribe to that topic. Each time a message is pushed, all of the downstream consumers will receive it, and they can then perform any necessary processing. Often this happens in a pipeline, where multiple stages perform various transformations and push the results back onto new topics. This technique of moving events through a pipeline is known as event streaming. Used properly, it can result in highly scalable applications that are more resilient to failure.

So let's get started. If you aren't already on Confluent Developer, head there now using the link in the video description to access the rest of this course and its hands-on exercises.
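To make the asynchronous flow described above a little more concrete, here is a minimal C# sketch of the fitness-tracker example using the Confluent.Kafka client. The event type, topic name ("heart-rate-recorded"), consumer group, and JSON serialization are assumptions chosen for illustration, not the exact code used later in this course: the producer represents the server recording the raw data as an event, and the consumer represents a downstream application that subscribes to the topic and processes the event after the device has disconnected.

using System;
using System.Text.Json;
using System.Threading;
using Confluent.Kafka;

// An event is an immutable record of something that already happened.
public record HeartRateRecorded(string DeviceId, int BeatsPerMinute, DateTime RecordedAt);

public static class Program
{
    public static void Main()
    {
        // Producer side: record the raw data as an event, then acknowledge the device.
        var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };
        using var producer = new ProducerBuilder<string, string>(producerConfig).Build();

        var evt = new HeartRateRecorded("device-42", 72, DateTime.UtcNow);
        producer.Produce("heart-rate-recorded", new Message<string, string>
        {
            Key = evt.DeviceId,
            Value = JsonSerializer.Serialize(evt)
        });
        producer.Flush(TimeSpan.FromSeconds(10));

        // Consumer side: a downstream application subscribes to the topic and
        // processes each event asynchronously, long after the device disconnected.
        var consumerConfig = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092",
            GroupId = "metrics-calculator",
            AutoOffsetReset = AutoOffsetReset.Earliest
        };
        using var consumer = new ConsumerBuilder<string, string>(consumerConfig).Build();
        consumer.Subscribe("heart-rate-recorded");

        var result = consumer.Consume(CancellationToken.None);
        var received = JsonSerializer.Deserialize<HeartRateRecorded>(result.Message.Value);
        Console.WriteLine($"Calculating metrics for {received?.DeviceId} recorded at {received?.RecordedAt}");
    }
}

In a streaming pipeline, a stage like the consumer above would typically produce its computed results onto a new topic, where the next stage subscribes in turn.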