Not surprisingly, your business has collected a lot of data over the past few years, and you have used analytical databases or data warehouses to organize it and extract insights. Congratulations: you have taken the first step in your data strategy and produced analytics that will help drive your business!
Meanwhile, your applications need to operate in real time, making immediate decisions on streaming data (data in motion) in both the customer-facing applications that drive revenue and build brand loyalty, and the internal applications that help your business run efficiently and reduce costs. Applying analytics directly in these applications lets you put what you have learned to work, automating your business workflows and removing the manual steps.
Consider credit card fraud detection: ideally, suspicious transactions are stopped or routed through extra security validation. If your systems can't stop a fraudulent transaction during the purchase approval process (in-line, within the transaction) and the fraud is only detected afterward, you are left exposed, and that can be a very costly proposition. Banks and merchants must do all of the legwork to reverse the transactions through costly chargebacks, and while those transactions are being rolled back, you remain open to further fraudulent activity. Even if your analytics accurately identify and flag suspicious behavior, operating in a post-event, reactive state means manual processes, and those create a window of opportunity for fraudsters. The question is how to operationalize the analysis so that it not only detects fraud but also prevents it, acting against fraudulent events in-line during the transaction. That is thinking in a proactive state.
The underlying mechanism, acting within an event and running a large set of rules or business logic in single-digit milliseconds, extends well beyond fraud use cases. It applies wherever applications need their systems to make intelligent decisions based on 1,000-plus rules, in ever less time, to meet the data-driven, truly real-time needs of their end users.
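To make the in-line idea concrete, here is a minimal sketch of rule evaluation inside the approval path. The rule names, thresholds, and transaction fields are all illustrative assumptions, not actual Volt Active Data rules; a real deployment would run over a thousand such rules.

```python
def evaluate_rules(txn, rules):
    """Run every rule against the transaction; any hit blocks approval."""
    hits = [name for name, rule in rules.items() if rule(txn)]
    return ("DECLINE" if hits else "APPROVE", hits)

# Illustrative rules only; a production system runs 1,000+ of these.
RULES = {
    "amount_over_limit": lambda t: t["amount"] > 5_000,
    "foreign_and_large": lambda t: t["country"] != t["home_country"]
                                   and t["amount"] > 1_000,
    "rapid_repeat":      lambda t: t["txns_last_minute"] > 5,
}

decision, fired = evaluate_rules(
    {"amount": 2_500, "country": "BR", "home_country": "US",
     "txns_last_minute": 1},
    RULES,
)
print(decision, fired)  # -> DECLINE ['foreign_and_large']
```

The key point is where this runs: inside the approval transaction, before the purchase completes, rather than in a batch job after the fact.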
Turning Analytics into Real Time Decisions
Let us take a look at what you need, from the backend database perspective, to apply such intelligence and analytics to enable operational decisions.
- Execute numerous complex rules and business logic within each individual transaction, finishing in milliseconds. Users expect lightning-quick interaction with their applications, which typically translates to a result or decision within 250 milliseconds. Out of that latency budget, only about 50 milliseconds are left for your database to respond.
- Accommodate hundreds of thousands, even millions, of concurrent application users. The system must handle these users' requests transactionally and simultaneously.
- Deliver that performance consistently: 100% of customer requests should be returned within the budgeted latency (milliseconds).
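The three requirements above can be sketched together: the database's share of the budget is a hard per-request ceiling, checked for every request rather than for an average. The numbers come from the text; the "database call" below is a stand-in function, not a real database round trip.

```python
import time

END_TO_END_BUDGET_MS = 250   # what the end user perceives as instant
DB_BUDGET_MS = 50            # what remains for the database itself

def timed_call(fn, *args):
    """Time a single call in milliseconds."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed_ms = (time.perf_counter() - start) * 1000
    return result, elapsed_ms

def database_call(txn_id):   # stand-in for the real database round trip
    return {"txn": txn_id, "decision": "APPROVE"}

# "Consistently deliver": every request, not a percentile, must fit.
latencies = []
for i in range(1000):
    _, ms = timed_call(database_call, i)
    latencies.append(ms)

worst = max(latencies)
print(f"worst-case: {worst:.3f} ms, budget: {DB_BUDGET_MS} ms")
assert worst <= DB_BUDGET_MS  # 100% of requests within budget
```

A stand-in call trivially passes; the point is the shape of the check, a ceiling on the worst case rather than a target for the mean.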
Bridging the Gap
Having established your analytics strategy within your data warehouse, you might think: "Why don't I just use it to support our customer-facing applications?" The assumption is that your current solution supports "real-time analytics," but can it really power customer-facing applications directly? A few things to consider:
- These databases process transactions in a batch or micro-batch fashion, with latencies typically measured in seconds. If they don't offer immediate consistency, accessing freshly loaded data can take even longer. They are not designed to fit into a latency budget of 50 milliseconds or less.
- These databases are optimized for a small number of complex queries over large amounts of data, not for massively concurrent queries from many users. For simple applications, you might add streaming or caching tools to absorb the concurrent reads, but those tools will likely struggle with the frequent writes and updates your system needs to do.
- If these databases don't provide full SQL support, you may have to write a lot of code in the application to fill the gap. That makes it harder for the application to leverage the database's own optimizations and to deliver consistent performance.
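The last point is easy to see with a toy example. With full SQL, an aggregation runs inside the database engine; without it, the application must pull raw rows out and reimplement the logic itself. SQLite stands in here for any SQL database; the table and data are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (card TEXT, amount REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?)",
                 [("A", 10.0), ("A", 20.0), ("B", 5.0)])

# With SQL: one query, executed and optimized by the database engine.
totals_sql = dict(conn.execute(
    "SELECT card, SUM(amount) FROM txns GROUP BY card"))

# Without SQL: ship every row to the application and aggregate by hand.
totals_app = {}
for card, amount in conn.execute("SELECT card, amount FROM txns"):
    totals_app[card] = totals_app.get(card, 0.0) + amount

print(totals_sql == totals_app)  # -> True
```

Both paths produce the same answer, but the hand-rolled version moves all the data across the wire and forfeits whatever the engine could have optimized.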
How Volt Active Data Helps
Volt Active Data is an in-memory database designed to bring your analytics to life, letting them serve customers directly, for the following reasons:
- A highly innovative architecture, developed by Michael Stonebraker et al., eliminates many of the bottlenecks found in legacy databases and enables highly parallel execution over distributed data stores. It delivers single-digit-millisecond latency and throughput of millions of transactions per second. As discussed above, these capabilities let your applications interact in real time with all of your users.
- Stored procedures move logic to the server side, placing processing next to the data. This lets the system process millions of transactions per second in production while maintaining consistent performance.
- The highest level of ACID compliance, serializable isolation, ensures no data loss with minimal extra programming in the application.
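The stored-procedure and ACID points can be sketched together: the decision logic and the write commit, or roll back, as one atomic unit next to the data. This is a generic illustration using SQLite; Volt Active Data stored procedures are actually written in Java and run inside the server, so only the transactional shape carries over, and the account, balance, and rule below are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # manual txns
conn.execute("CREATE TABLE accounts (id TEXT PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('acct1', 100.0)")

def approve_purchase(conn, acct, amount):
    """Check the rule and apply the debit in one atomic transaction."""
    try:
        conn.execute("BEGIN IMMEDIATE")
        (balance,) = conn.execute(
            "SELECT balance FROM accounts WHERE id = ?", (acct,)).fetchone()
        if amount > balance:            # illustrative 'fraud/limit' rule
            conn.execute("ROLLBACK")
            return "DECLINE"
        conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                     (amount, acct))
        conn.execute("COMMIT")
        return "APPROVE"
    except Exception:
        conn.execute("ROLLBACK")        # no partial state survives a failure
        raise

print(approve_purchase(conn, "acct1", 30.0))   # -> APPROVE
print(approve_purchase(conn, "acct1", 500.0))  # -> DECLINE
```

Because the read, the rule check, and the write sit inside one serialized transaction next to the data, no interleaved request can observe or corrupt a half-applied decision.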
A Global Bank Use Case
I recently met with one of Volt Active Data's global banking partners, which has deployed Volt Active Data within its credit card center. The bank previously detected credit card fraud post-event: staff manually flagged fraudulent activity and then called the merchants to reverse the transactions. To automate the process, the bank first collected comprehensive data about cardholders and their credit card usage, then used Hadoop to run deep analytics on that data. Those analytics let them understand fraudulent behavior and build advanced rules in Volt Active Data to prevent it.
Volt Active Data runs these rules against each credit card transaction during the purchase approval process to stop fraudulent transactions from going through. When a card is swiped, Volt Active Data has only 50 milliseconds to run all 1,500 rules and determine whether the transaction is fraudulent. If it is, the system either cancels the transaction or requires the user to enter additional security information. The system currently handles about 10,000 credit card transactions per second.
Since the bank incorporated the fraud prevention platform into the credit card purchase approval process, the number of fraud cases has dropped by 50%, resulting in annual savings of over $15 million for the bank.
Conclusion
Analytics, as part of your big data strategy, give you the insight that powers your business. Applying those analytics in real-time applications, such as fraud detection, risk and portfolio management, or regulatory compliance, maximizes their value and automates your business processes.
Stay tuned for my next blog in which I’ll cover how to leverage machine learning-powered analytics in real time. In the meantime, feel free to check out our Technical Overview highlighting the capabilities of Volt Active Data.