Event Streaming vs Event Sourcing: Maximizing Business Efficiency

by DataStax, May 15th, 2023

Too Long; Didn't Read

What’s the difference between event streaming and event sourcing? When is the right time to choose either pattern? In this post, we look at each approach within event-driven architectures and learn which is right for your solution.


Customers like to be aware of events when they happen. After a customer orders a new pair of shoes and receives a notification that the purchase has been shipped, getting up-to-the-minute shipping status updates before it arrives improves the overall customer experience.


The updates about your order are events that trigger a response in an event-driven architecture (EDA). An EDA is a software design that reacts to changes of state (events) and transmits these events using a decoupled architecture. This decoupled architecture can employ several design patterns like the publish-subscribe (pub-sub) pattern, where a producer publishes an event and a subscriber watches for events, but neither is dependent on the other.


Event streaming and event sourcing represent two ways that organizations can power their EDAs.


With event streaming, there’s a continuous stream of data flowing between systems, with each message representing a new event broadcast using the pub-sub pattern. Event sourcing, on the other hand, stores every new event in an append-only log. This log serves as a source of truth containing events and their context in chronological order.


Event sourcing and event streaming are often used side by side in EDAs, but it’s important to distinguish the two as they work very differently. While event streams promote more accessible communication between systems, event sourcing provides event history by storing new events in an append-only log.


Here, we’ll discuss both event-coordinating methods and provide a few use cases for each.


Event Streaming: Decouple Your Services

Event streaming employs the pub-sub approach to enable more accessible communication between systems. In the pub-sub architectural pattern, consumers subscribe to a topic or event, and producers post to these topics for consumers’ consumption. The pub-sub design decouples the publisher and subscriber systems, making it easier to scale each system individually.


The publisher and subscriber systems communicate through a message broker like Apache Pulsar. When a state changes or an event occurs, the producer sends the data (data sources include web apps, social media, and IoT devices) to the broker, after which the broker relays the event to the subscriber, who then consumes it.
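The producer-broker-subscriber relationship above can be sketched with a minimal in-memory broker. This is an illustrative stand-in, not the Apache Pulsar API; the topic name and event payload are hypothetical. The key property it demonstrates is decoupling: the publisher only knows the topic, never who is listening.

```python
from collections import defaultdict
from typing import Callable

class Broker:
    """Minimal in-memory stand-in for a message broker like Apache Pulsar."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic: str, callback: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, event: dict) -> None:
        # The producer addresses a topic, not a subscriber.
        for callback in self._subscribers[topic]:
            callback(event)

broker = Broker()
received = []
broker.subscribe("orders/shipping", received.append)
broker.publish("orders/shipping", {"order_id": 42, "status": "out for delivery"})
```

Because neither side holds a reference to the other, publishers and subscribers can be scaled or replaced independently, which is the point of the pattern.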


Event streaming involves the continuous flow of data from sources like applications, databases, sensors, and IoT devices. Event streams employ stream processing, in which data is processed and analyzed as it’s generated. This quick processing translates to faster results, which is valuable for businesses with a limited time window for taking action, as with any real-time application.
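To make the per-event processing idea concrete, here is a small sketch (with made-up latency values) that updates a running average as each event arrives, rather than waiting for a complete batch:

```python
def rolling_average(stream):
    """Process each event as it arrives, emitting an updated average
    instead of waiting for a full batch."""
    total, count = 0.0, 0
    for value in stream:
        total += value
        count += 1
        yield total / count

latencies = [120, 80, 100, 60]
print(list(rolling_average(latencies)))  # one updated average per event
```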


Event streaming provides several advantages for businesses; here are a few:


Improved Customer Experience

Event streaming and processing offer organizations the ability to enrich their customers’ experience. For example, a customer placing a dinner order can get instant status updates, notifying them when the delivery vehicle is on its way to their location, or if it has arrived. This heightened customer experience translates into more trust, better reviews, and improved revenue.

Risk Mitigation

Applications like PayPal and other financial technology applications can employ event streaming to provide online fraud detection to enhance security using real-time monitoring. Fraud algorithms test the circumstances of an event (purchase or transaction) using predictive analytics to detect a deviation from the norm (an outlier). If the system detects an outlier or unusual event, it stops the transaction or blocks the card from completing it.
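A simple version of such an outlier check can be sketched with a z-score against a customer's transaction history. Real fraud systems use far richer predictive models; the threshold and amounts here are hypothetical.

```python
import statistics

def is_outlier(amount: float, history: list[float], threshold: float = 3.0) -> bool:
    """Flag a transaction whose amount deviates more than `threshold`
    standard deviations from the customer's historical mean."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return amount != mean
    return abs(amount - mean) / stdev > threshold

history = [20.0, 35.0, 18.0, 42.0, 25.0]
print(is_outlier(30.0, history))    # a typical purchase passes
print(is_outlier(5000.0, history))  # an unusual amount is flagged
```

In an event-streaming setup, a check like this would run on each transaction event as it passes through the broker, blocking the transaction before it completes.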

Reduced Operational Costs

By analyzing event streams, industrial tools can log performance and health metrics to assess equipment health. This feature enables organizations to perform predictive maintenance on machines before a total breakdown, which costs more to repair. In manufacturing, for example, organizations can employ Pulsar streams to aggregate and process data from machine parameters like temperature or pressure. Engineers could set a machine’s maximum temperature and set an alert that would be triggered if that temperature is exceeded. Machine operators could perform checks and maintenance before more costly problems occur.
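The temperature-alert scenario can be sketched as a simple scan over a stream of readings. The machine IDs, readings, and 95°C maximum are all illustrative; in practice the threshold check would run inside a stream processor consuming the Pulsar topic.

```python
MAX_TEMP_C = 95.0  # hypothetical maximum set by engineers

def check_readings(readings):
    """Scan a stream of (machine_id, temperature) events and return an
    alert for every reading that exceeds the configured maximum."""
    alerts = []
    for machine_id, temp in readings:
        if temp > MAX_TEMP_C:
            alerts.append(f"ALERT: {machine_id} at {temp}°C exceeds {MAX_TEMP_C}°C")
    return alerts

stream = [("press-1", 88.2), ("press-2", 97.5), ("press-1", 91.0)]
for alert in check_readings(stream):
    print(alert)
```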


How is Event Streaming Used?

Event streaming is essential for businesses and applications that stream a high volume of data and depend on fast, actionable insights. These applications include e-commerce, financial trading, and IoT devices.


Financial trading applications employ event streaming to publish time-sensitive events where customers want to act immediately. For instance, users may subscribe to a backend service that sends updates on specific events, like a change in stock price, to enable timely decision-making.


Event streaming also has risk- and fraud-detection applications in financial systems that process payments and other transactions (and block fraudulent transactions). Defined fraud algorithms can block suspicious transactions by analyzing data immediately after it’s generated.


Event Sourcing: An Ordered History

Event sourcing stores data as events in an append-only log. The process captures every change to an application’s state as an event object and stores these objects in chronological order. With event sourcing, the event store records the state of a business entity as a sequence of events, and each change in state, like a new order or the cancellation of an order, appends the latest event to that sequence.
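A minimal event store can be sketched as an append-only list of immutable event objects; the event types and payloads below are hypothetical. Nothing is ever updated or deleted, only appended, and the sequence number preserves chronological order.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    sequence: int
    event_type: str
    data: dict

class EventStore:
    """Append-only log: state changes are recorded, never updated in place."""
    def __init__(self):
        self._log: list[Event] = []

    def append(self, event_type: str, data: dict) -> None:
        self._log.append(Event(len(self._log), event_type, data))

    def events(self) -> list[Event]:
        return list(self._log)  # chronological by construction

store = EventStore()
store.append("OrderPlaced", {"order_id": 1, "item": "shoes"})
store.append("OrderCancelled", {"order_id": 1})
```

The current state of order 1 is not stored anywhere directly; it is derived by reading the log, which is what makes the log a source of truth.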


For event sourcing to work efficiently and consume minimal resources, each event object should contain only the necessary details. That minimizes storage space and avoids spending valuable resources processing data that yields no actionable insights.


Event stores compile business events and context; appending long streams to the event logs consumes database storage quickly. Keeping only the necessary event contexts as part of the event object helps free up storage space for adding multiple event logs, which drive actionable insights.


Organizations may choose to use “snapshots” to help optimize performance in such cases. A snapshot stores the current state of the entity at a point in time, so determining the current state only involves loading the latest snapshot and replaying the events recorded after it.
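The snapshot optimization can be sketched as a fold over stock-change events (the deltas below are made up). Replaying the full history and replaying only the events recorded after a snapshot must reach the same state:

```python
def replay(events, initial=0):
    """Rebuild current state by folding every recorded change, starting
    from `initial` (zero, or a previously persisted snapshot)."""
    state = initial
    for delta in events:
        state += delta
    return state

events = [100, -15, 6, -4, 12]          # hypothetical stock changes
full_state = replay(events)             # replay the entire history

snapshot = replay(events[:3])           # state persisted after the third event
current = replay(events[3:], snapshot)  # replay only what came after

assert full_state == current
```

With millions of events, starting from the latest snapshot instead of event zero is what keeps state reconstruction fast.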

Let's illustrate this. Suppose we have a database that takes stock of items in an e-commerce store, and the current stock for one item is 91.

Most databases store only the current state. If we were to account for the journey to that final stock value of 91, there would be no certainty or clarity about how we got there. Event sourcing records every state change in a log, making it possible to trace event history for root-cause analysis and auditing.


With event sourcing, the log would instead record three events, each with a date, quantity, and type of item. In this case, we can trace back how we arrived at the final amount of 91.
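The traceback can be sketched by replaying the log. The three events below are hypothetical (the dates, types, and quantities are illustrative), chosen so the running total lands on the final stock of 91:

```python
# Hypothetical stock-change events for one item; values are illustrative.
events = [
    {"date": "2023-05-01", "type": "Restocked", "quantity": 100},
    {"date": "2023-05-08", "type": "Sold",      "quantity": -15},
    {"date": "2023-05-12", "type": "Returned",  "quantity": 6},
]

stock = 0
for event in events:
    stock += event["quantity"]
    print(event["date"], event["type"], "->", stock)

print("Final stock:", stock)  # 91
```

Each intermediate total is recoverable, which is exactly what a current-state-only database cannot provide.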



Healthcare is one of the most heavily regulated industries, with ever-changing regulations to protect customer information. Healthcare organizations need a flexible storage solution that adapts to growing data needs while making it easy to migrate legacy systems to newer technologies.


By employing event stores as their single source of truth, healthcare systems can rely on the immutable state of event logs for the actual state of their data and make valuable projections by employing real-time stream processing. Retail and e-commerce businesses could gain better knowledge of their customers by analyzing large, durable event stores, which helps them create more personalized customer experiences.


Differences Between Event Streaming and Event Sourcing

There are a few similarities between event streaming and event sourcing. For one, each event coordination method employs a decoupled microservices architecture, which helps improve scalability and performance.


Although event stores and streams differ in state durability, both are essential in providing the current state of events for analysis and for driving business decisions. Both event coordination methods also offer durable storage, although event stores usually offer much longer retention than event streams.


Here, let’s delve more into some key differences between event streaming and event sourcing.


Optimization

Event streaming is optimized for communicating data in motion: it decouples publishers from subscribers and makes it easy to publish millions of messages at high throughput. Event sourcing, on the other hand, helps establish event history by storing every new state of an entity in an append-only log.

Data Movement

With event sourcing, data exists at rest because events are immutable once stored. Event streams, however, involve data constantly in transit, passing between systems like databases, sensors, and applications.



Wrapping Up

Event streaming and event sourcing help coordinate events in an event-driven architecture. Although their use and value are different, they work well together to help build a durable and high-performance application.


Event streaming employs the decoupled pub-sub pattern to continuously stream data from various sources, which helps drive business decision-making. However, although event-streaming tools may offer durable storage, they’re not designed to store messages for long; messages typically persist only long enough to make the system fault-tolerant and resilient.


One can view event sourcing as a subset or component of event streaming. Event sourcing appends each new event to the current list of events in an ordered manner. It can also act as a source of truth for reliable audits and for obtaining the current state of events at any time. Event sourcing is crucial for financial industries with heavy regulatory and audit requirements, which need a reliable store from which to trace and rebuild the current state of events. In contrast, event streaming is crucial in financial-trading applications where actions have a time-bound window and require an immediate response.


EDA isn’t necessarily a destination. It’s a path to follow, driving certain system performance and characteristics. For example, event streaming decouples a collection of microservices so they become less dependent on each other, which drives resilience and easier iteration, among other benefits. Combined with event sourcing, microservices gain the ability to replay events as well as a full log of changes for a given feature, like a user’s profile. This kind of architecture opens new possibilities within existing systems.


