As 2023 gets going, most companies are still pretty obsessed with “data”, and so are we. Data is “the new oil”; you knew that already. But trends within the vast data realm come and go, and right now some are clearly coming while others are clearly going.
As it happens, 2022 turned out to be a big year for data. A quick recap:
- 5G started to come out of the lab.
- The cloud continued its impressive trajectory to international tech stardom.
- And Big Data continued to recede into the shadows of Big AI.
So what does 2023 hold?
Here are the five data trends that our own field experience tells us will matter quite a lot next year and beyond.
1. Transaction volumes will skyrocket
Everybody understood that IoT would create a lot of traffic, with “10x” being the most oft-cited number. But when companies also embrace microservices and REST APIs, they provoke a massive increase in the number of transactions, because certain functions (like authentication) now need to be done for each and every microservice. This means we can expect scenarios where transaction volumes climb dramatically even though the functionality remains more or less the same.
While nobody is seriously suggesting that we abandon microservices or the IoT, 2023 will force companies to confront the fact that they’ve solved one problem while creating another. Stateless systems have always been easier to scale than stateful ones, and for the stateful parts of the workload, asking for a 10x increase in TPS often means using a substantially bigger server.
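To make the amplification concrete, here is a minimal sketch. The service names are hypothetical and a toy HMAC token stands in for a real auth stack; the point is simply that one front-end request fanning out to five microservices triggers five separate authentication checks:

```python
import hmac, hashlib

SECRET = b"demo-secret"  # shared signing key; illustration only

def issue_token(user_id: str) -> str:
    """Sign the user id so each service can verify it independently."""
    sig = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return f"{user_id}.{sig}"

def verify_token(token: str) -> bool:
    """Every microservice repeats this check on every call it receives."""
    user_id, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

def handle_user_request(token: str, services: list[str]) -> int:
    """One front-end request fans out to N services; each one re-authenticates."""
    auth_checks = 0
    for _service in services:
        if not verify_token(token):
            raise PermissionError("invalid token")
        auth_checks += 1  # one extra backend transaction per hop
    return auth_checks

token = issue_token("alice")
# A single 'place order' click touching five services triggers five auth checks.
print(handle_user_request(token, ["catalog", "pricing", "inventory", "payment", "shipping"]))
```

The functionality hasn’t changed from the user’s point of view, but the backend transaction count has.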
2. Data ownership, security, and stewardship will become major issues
Events like Brexit, combined with increasing consumer and regulatory sensitivity to data breaches, mean that people are now, more than ever, asking where their personal data is kept and what steps are being taken to secure it.
Legislation like Europe’s GDPR and California’s CCPA is giving end users far more of a say in what happens to their data, which is a good thing. But this isn’t just a paper-based, box-ticking exercise: companies may find that their giant database of user data now needs to be split into multiple smaller ones for different legal jurisdictions.
The ability to control how and where data is kept is going to matter: companies will need to be able to keep a given customer’s data within the required national borders while still being able to hold data in multiple geographic locations.
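As a rough illustration of what that splitting can look like in practice, here is a small sketch that routes each customer’s record to a store for their jurisdiction. The region names and in-memory “stores” are stand-ins for real databases, not a description of any particular product:

```python
# Hypothetical jurisdiction -> regional store mapping; names are illustrative only.
REGION_FOR_JURISDICTION = {
    "EU": "db-eu-frankfurt",
    "UK": "db-uk-london",
    "US-CA": "db-us-west",
}

class RegionalStores:
    """Keeps each jurisdiction's records in its own store (stand-in for separate databases)."""
    def __init__(self):
        self._stores = {region: {} for region in REGION_FOR_JURISDICTION.values()}

    def save_customer(self, customer_id: str, jurisdiction: str, record: dict) -> str:
        region = REGION_FOR_JURISDICTION.get(jurisdiction)
        if region is None:
            raise ValueError(f"no approved storage region for {jurisdiction}")
        self._stores[region][customer_id] = record
        return region  # callers can log where the data physically lives

stores = RegionalStores()
print(stores.save_customer("c-42", "EU", {"name": "Anna", "consent": True}))    # db-eu-frankfurt
print(stores.save_customer("c-43", "US-CA", {"name": "Bob", "consent": True}))  # db-us-west
```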
3. TCO will determine decisions
Left to themselves, developers generally pick the ‘best’ technology first and then worry about TCO later, if at all. But in a world where TCO is as important as rapid feature delivery, expect to see a shift in emphasis from benchmarks where people tout high TPS numbers to ones where people talk about low costs per transaction.
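A back-of-the-envelope example of that shift, with entirely made-up numbers: the system with the higher TPS is not necessarily the one with the lower cost per transaction.

```python
def cost_per_million_txns(monthly_cost_usd: float, sustained_tps: float) -> float:
    """Turn a monthly infrastructure bill and sustained throughput into $ per million transactions."""
    txns_per_month = sustained_tps * 60 * 60 * 24 * 30
    return monthly_cost_usd / (txns_per_month / 1_000_000)

# Illustrative numbers only: system A wins the TPS benchmark, system B wins on cost per transaction.
for name, monthly_cost, tps in [("A", 40_000, 50_000), ("B", 12_000, 20_000)]:
    print(f"System {name}: {tps:>6} TPS, ${cost_per_million_txns(monthly_cost, tps):.3f} per million txns")
```

In this toy comparison, system A tops the benchmark chart while system B delivers each transaction more cheaply, which is exactly the kind of result a TCO-first evaluation surfaces.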
4. The API economy will have winners and losers
Publicly exposed standard APIs are becoming ‘a thing’, especially in the telco space. But there’s more to joining the API economy than just exposing your existing applications. For example, your existing API may make implicit assumptions about what’s called in what order, or you may be asked to implement a standard API that doesn’t properly mesh with how you’ve already solved the same problem. This creates risk for legacy incumbents and opportunities for startups.
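Here is a sketch of the ordering problem; the API names are invented for illustration, not taken from any real telco standard. A legacy system silently requires reserve() before confirm(), and an adapter has to absorb that assumption to expose the kind of one-shot call a standard API might demand:

```python
class LegacyOrderSystem:
    """Legacy API with an implicit contract: reserve() must be called before confirm()."""
    def __init__(self):
        self._reserved = set()

    def reserve(self, order_id: str) -> None:
        self._reserved.add(order_id)

    def confirm(self, order_id: str) -> str:
        if order_id not in self._reserved:
            raise RuntimeError("confirm() called before reserve()")  # the hidden ordering assumption
        return f"order {order_id} confirmed"

class StandardApiAdapter:
    """Maps a one-shot 'create order' call (standard-style) onto the legacy two-step flow."""
    def __init__(self, legacy: LegacyOrderSystem):
        self._legacy = legacy

    def create_order(self, order_id: str) -> str:
        self._legacy.reserve(order_id)  # the adapter absorbs the ordering requirement
        return self._legacy.confirm(order_id)

adapter = StandardApiAdapter(LegacyOrderSystem())
print(adapter.create_order("o-1001"))
```

The adapter works here, but every hidden assumption like this is something an incumbent has to discover and paper over, while a newcomer gets to implement the standard API directly.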
5. AI/ML will earn its keep (key word: “earn”)
An awful lot of time, energy, and PowerPoint slides have been devoted to AI and ML. At Volt we have customers using AI and ML in production for everything ranging from hyper-personalization to fraud prevention.
In our experience, success with AI + ML projects requires two things:
- The use case has to make commercial sense. Just as not every story has to have a happy ending, not every proposed AI/ML use case will make it to production. In the current economic climate, a clear economic benefit should be a prerequisite for going live with any AI/ML project.
- The model’s inputs have to be updated in real time. Even when a use case works at the desktop level, we’ve seen failures caused by models that can’t or don’t keep up with real-world events, or even their own previous decisions. Unless you track and update state for each and every call to your model, it’s like trying to play chess without knowing what the board looks like (see the sketch below).
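Here is a minimal sketch of that second point, using a toy threshold rule in place of a real model and invented card IDs. The state is updated on every call, so each decision can see the ones that came before it:

```python
import time
from collections import defaultdict

class LiveFeatureState:
    """Tracks per-card spend so every scoring call sees the latest events, including prior approvals."""
    def __init__(self, window_seconds: int = 60):
        self.window = window_seconds
        self.events = defaultdict(list)  # card_id -> [(timestamp, amount), ...]

    def record(self, card_id: str, amount: float) -> None:
        self.events[card_id].append((time.time(), amount))

    def spend_in_window(self, card_id: str) -> float:
        cutoff = time.time() - self.window
        return sum(a for t, a in self.events[card_id] if t >= cutoff)

def score_transaction(state: LiveFeatureState, card_id: str, amount: float) -> str:
    """Stand-in for a real model: decline if recent spend plus this charge exceeds a limit."""
    decision = "decline" if state.spend_in_window(card_id) + amount > 500 else "approve"
    if decision == "approve":
        state.record(card_id, amount)  # update state so the next call knows about this one
    return decision

state = LiveFeatureState()
print(score_transaction(state, "card-1", 300))  # approve
print(score_transaction(state, "card-1", 300))  # decline: the first approval is already reflected
```

Skip the state.record() step and the second charge sails through, which is exactly the “chess without the board” failure.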
Companies are investing a lot of money in AI and ML right now, but, given the times we live in, it’s hard to see how AI/ML projects that aren’t materially impacting the bottom line can avoid tough auditing in 2023.
Conclusion
Forecasting the future is hard, but in each of the cases we look at, there are clear signs that big changes are coming in 2023, and they will ultimately be good changes: things that make apps faster, safer, and more affordable to run. That said, global challenges are definitely forcing a focus on TCO above all: if it doesn’t help the business, it doesn’t get funded.
Volt has always focused on use cases that we know impact companies’ bottom line, and our technology has always been geared toward keeping things running for the sake of the business. We can run at 2-3x the speed of alternatives, and sometimes around 9x the speed, due to the time we devoted to implementing the core of our product in C++, an energy- and resource-efficient language. We’ve also roadmapped OpenAPI support and have running prototype implementations that eliminate the need for an application server, and our patented Active(N) technology allows an unprecedented level of geo-replication.
Get started with Volt here.