We’ve reached the moment where old information pathways won’t work anymore. Managing right here, right now requires a framework that we’ll get to in a moment. First, some history.
Growing up before the Internet, ‘right now’ technology showed up as the rare long distance call or watching live television coverage of an Apollo launch. Everything else was an offline transaction of mail, checks, and standing in customer service lines. The world played catch up with paperwork, often after hours and overnight. Playing catch up is now a part of history.
Check is not in the mail
In our globalized world, there is no ‘overnight’ and the check is no longer in the mail. Thanks to my iPhone and the USAA app, I deposit checks and get instant credit simply by snapping a photo. ‘Settlement’ in financial circles is now a constant activity. In Wall Street’s world, trades occur so rapidly, often kicked off by algorithms rather than the human hand, that systems can no longer calculate a daily Value at Risk (VaR) as they did when I worked in energy trading just ten years ago.
Ubiquitous Data
These new realities up the ante significantly when Ubiquitous Data (my term for what is misleadingly called Big Data) is added to the mix. There are so many ways to gather and analyze information that simply didn’t exist a few years ago. Sentiment analysis, pricing, supply chain management, and information security all rely on absorbing data from human and machine sources moving at constant, full speed.
The old-school business processes that use that data are also falling away. By the time we plan the logical flow of a new piece of business, it has changed under us.
The old ways of doing business won’t work anymore. Those ways are simply too slow and rely too much on structure, planning, and ‘catch up’ techniques like batch processing and capacity planning. The old ways can’t handle changing processes and Ubiquitous Data in a 24 x 7 world. But there’s a new way.
Going in-memory
If you think about your laptop’s memory, you’ll remember that it clears itself every time it loses power. It’s the reason why a reboot can fix your PC. But in our always-on world of ubiquitous data, losing data won’t work. Fortunately, we can now safeguard in-memory information through redundancy and write-to-disk, just as safely as we did before in-memory became financially and technically viable. The risk is managed, and the speed lets us do things we could never do before.
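To make that concrete, here is a minimal sketch, not any particular vendor’s implementation and with class and file names of my own invention, of how an in-memory store keeps its speed while writing to disk for safety: reads and writes are served from memory, every write is also appended to a log on disk, and the log is replayed to rebuild state after a power loss.

import json
import os

class InMemoryStore:
    """Sketch: serve reads and writes from memory, append each write to a
    disk log for durability, and replay the log to recover after a crash."""

    def __init__(self, log_path="store.log"):
        self.log_path = log_path
        self.data = {}
        self._replay_log()  # rebuild in-memory state after a restart

    def _replay_log(self):
        if not os.path.exists(self.log_path):
            return
        with open(self.log_path) as log:
            for line in log:
                record = json.loads(line)
                self.data[record["key"]] = record["value"]

    def put(self, key, value):
        # Append to disk first (durability), then update memory (speed).
        with open(self.log_path, "a") as log:
            log.write(json.dumps({"key": key, "value": value}) + "\n")
        self.data[key] = value

    def get(self, key):
        return self.data.get(key)  # answered entirely from memory

Redundancy works the same way in spirit: the write is also shipped to a second node’s memory, so losing one machine doesn’t lose the data.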
We gain scale by clustering commodity hardware, adding nodes as needed to provide enough in-memory storage and processing speed for the requirements of the moment. Need more? Simply add more hardware. It is more efficient and doesn’t require locking down resources you may only need in the future. And much of the administration once required to ‘maximize the database’ disappears. It is a rare win-win.
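As an illustration of that scale-out idea, here is a sketch of one common placement technique, consistent hashing, rather than any specific product; the node names are made up. Keys are spread across the cluster by hashing, and adding a node grows capacity while moving only a small share of the existing keys.

import bisect
import hashlib

def _hash(key: str) -> int:
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class Cluster:
    """Sketch: place keys on a consistent-hash ring of commodity nodes,
    so capacity grows by simply adding another node."""

    def __init__(self, nodes, replicas=100):
        self.replicas = replicas
        self.ring = []  # sorted list of (hash, node) virtual points
        for node in nodes:
            self.add_node(node)

    def add_node(self, node):
        for i in range(self.replicas):
            bisect.insort(self.ring, (_hash(f"{node}:{i}"), node))

    def node_for(self, key):
        idx = bisect.bisect(self.ring, (_hash(key), ""))
        if idx == len(self.ring):
            idx = 0  # wrap around the ring
        return self.ring[idx][1]

# Start with three commodity nodes, then add a fourth when more
# in-memory capacity is needed.
cluster = Cluster(["node-1", "node-2", "node-3"])
print(cluster.node_for("customer:42"))
cluster.add_node("node-4")
print(cluster.node_for("customer:42"))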
In-memory architecture
Gartner’s Massimo Pezzini presented his view earlier this year of what he calls the Next-Generation Architecture: In-Memory Computing. I share his view that order-of-magnitude changes in how we manage information right here, right now require new technologies. Getting there means taking an in-memory approach that is more than just an in-memory database (IMDB), the misleading but popular conversation point for right here, right now technology. An IMDB only solves part of the problem.
Pezzini’s approach calls for an infrastructure that builds in-memory applications that perform analytics, event processing, and application services atop in-memory storage and in-memory messaging. There’s no single point of speed or latency, but instead a framework that supports very fast transactions and queries while maintaining the flexibility to change with the business.
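A rough sketch of that layering follows, with purely illustrative event fields and names: application services publish events onto an in-memory channel, an event processor keeps in-memory state current, and analytics read that state with no disk in the query path.

import queue
import threading
from collections import defaultdict

events = queue.Queue()        # in-memory messaging
totals = defaultdict(float)   # in-memory storage (running aggregates)

def process_events():
    """Event processing: consume trades and keep per-symbol totals current."""
    while True:
        event = events.get()
        if event is None:     # shutdown signal
            break
        totals[event["symbol"]] += event["amount"]
        events.task_done()

worker = threading.Thread(target=process_events, daemon=True)
worker.start()

# Application services publish events as they happen...
events.put({"symbol": "XYZ", "amount": 1_250.0})
events.put({"symbol": "XYZ", "amount": -300.0})
events.join()

# ...and analytics query the in-memory state right here, right now.
print(totals["XYZ"])          # 950.0
events.put(None)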
As Pezzini points out, this approach opens up remarkable opportunities for self-service applications, unconstrained data exploration, and visualization of very complex information. I see the same architecture used for gaming, customer retention, energy efficiency and a host of other solutions. These are new ways we compete with speed, mobile, Cloud and SaaS, and in-memory architectures are core to making it all work.
Pezzini’s approach builds on the need to stay flexible while having speed and scalability to handle the hard-to-predict changes that are always coming.
Chris,
Interesting concept put forth by Massimo. I’m going to contact him directly as a result of your blog. I fully support what you both have said. Do you consider this statement, “The old-school business processes that use that data are also falling away. By the time we plan the logical flow of a new piece of business, it has changed under us,” to be resolved by your event processing?
Our software, by this definition, is an in-memory application platform. Our pieces address those failing business processes, allowing for more data, more efficient processes, sophisticated logic, and analytics on our application server. The old-school business systems don’t have to change, either. It is inefficient, expensive, and disruptive to modify, upgrade, integrate, or replace these systems. Leave them to what they do; decouple the failing process and innovate true business value faster. We’ve worked in manufacturing, retail, and healthcare.
Do you think the phrase ‘in-memory application platform’ is going to stick, or has it only recently been introduced?
I’m not promoting any particular software, just covering an idea. I think decoupling is critical, and that current business systems fall away in how they’re implemented, not that they need to be tossed.
I think your last question is bait, no?
No, I’m not nearly clever enough to leave bait. Our company operates in relative obscurity and this is the first technical description that appears to be in line with our solution. I was literally asking if you think that phrase is established or not.
OK, gotcha. Just keep in mind that I’m trying to be somewhat vendor neutral.