In the Volt Active Data ebook, “Fast Data: Smart and at Scale,” Ryan Betts and I outline what we have found, through years of work and collective experience, to be tried-and-true design patterns and recipes for fast data.
High-speed transactional and operational applications that process streams of incoming events are being built for use cases such as real-time authorization, billing, usage tracking, operational tuning, and intelligent alerting. Writing these applications requires combining streaming analytics with transaction processing on live data feeds.
Transactions in these applications require real-time analytics as input. Recalculating analytics from base data for each event in a high-velocity feed is impractical. To scale, it’s much more effective to maintain streaming aggregations that can be read cheaply in the transaction path. Unlike periodic batch operations, streaming aggregations maintain the consistent, up-to-date, and accurate analytics needed in the transaction path.
This pattern trades ad-hoc analytic flexibility for high-speed access to the specific analytic outputs an application needs. The trade-off is necessary when calculating an analytic result from base data for every transaction is infeasible.
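To make the pattern concrete, here is a minimal sketch in plain Java, not tied to any particular product API: the ingest path folds each incoming event into a per-account running total, and the transaction path (a real-time authorization check, one of the use cases above) reads that pre-computed aggregate instead of rescanning event history. The class, method names, and the CREDIT_LIMIT threshold are illustrative assumptions.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

// Sketch of the streaming-aggregation pattern: keep an incrementally maintained
// aggregate that the transaction path can read cheaply, rather than recomputing
// analytics from base data on every event.
public class UsageAggregator {

    // Running usage total per account, updated as each event arrives.
    private final Map<String, LongAdder> usageByAccount = new ConcurrentHashMap<>();

    // Hypothetical per-account limit used by the authorization decision.
    private static final long CREDIT_LIMIT = 10_000;

    // Ingest path: fold each incoming event into the running aggregate.
    public void onEvent(String accountId, long units) {
        usageByAccount.computeIfAbsent(accountId, k -> new LongAdder()).add(units);
    }

    // Transaction path: the authorization decision reads the pre-computed
    // aggregate in constant time instead of scanning the event history.
    public boolean authorize(String accountId, long requestedUnits) {
        long used = usageByAccount.getOrDefault(accountId, new LongAdder()).sum();
        return used + requestedUnits <= CREDIT_LIMIT;
    }

    public static void main(String[] args) {
        UsageAggregator agg = new UsageAggregator();
        agg.onEvent("acct-42", 2_500);
        agg.onEvent("acct-42", 4_000);
        System.out.println(agg.authorize("acct-42", 3_000)); // true:  6,500 + 3,000 <= 10,000
        System.out.println(agg.authorize("acct-42", 5_000)); // false: 6,500 + 5,000 >  10,000
    }
}
```

The same idea applies whether the aggregate lives in application memory, as here, or in a fast data platform's tables maintained by the ingest transactions themselves; the key point is that the read in the transaction path is cheap and always current.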
For more on streaming aggregations and transactions, along with other specific design patterns, download the ebook: “Fast Data: Smart and at Scale”.