Why In-Memory Needs Collaboration to Tango
I stumbled upon this insightful blog post by SAP CIO Oliver Bussmann about how SAP realizes value from its own SAP HANA (High-Performance Analytic Appliance).
HANA is powered by in-memory computing – a way to store and process data in main memory as opposed to disk storage. For a primer on in-memory computing, see this video of Hasso Plattner embedded in a post by ZDNet’s Dennis Howlett and this piece by CIO.com’s Chris Kanaracus.
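To make the disk-versus-memory gap concrete, here is a toy sketch in Python – my own illustration, not HANA code, and real in-memory databases go far beyond it. The same lookup is served two ways: by re-scanning a file on disk for every query, and by building an in-memory index once and answering each query with a hash probe.

```python
import os
import tempfile

def lookup_on_disk(path, key):
    # Re-read the file for every query -- each lookup pays the disk I/O cost.
    with open(path) as f:
        for line in f:
            k, v = line.rstrip("\n").split(",", 1)
            if k == key:
                return v
    return None

def build_in_memory_index(path):
    # Pay the I/O cost once; afterwards every lookup is an in-memory hash probe.
    index = {}
    with open(path) as f:
        for line in f:
            k, v = line.rstrip("\n").split(",", 1)
            index[k] = v
    return index

# Create a small sample data file (hypothetical order/region records).
fd, path = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w") as f:
    for i in range(10_000):
        f.write(f"order-{i},region-{i % 7}\n")

index = build_in_memory_index(path)

# Both paths return the same answer; only the latency profile differs.
assert lookup_on_disk(path, "order-42") == index["order-42"]
os.remove(path)
```

The point of the sketch is simply that once the data lives in memory, query latency no longer scales with disk access – which is what makes the "instantaneous results" in the use case below plausible.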
Citing a use case, Oliver writes:
Here’s the problem many companies face today: global executive pipeline reports are at least a day old, making real-time decisions and tactical adjustments impossible. By the time critical information or trends reach decision-makers, it could be too late. In-memory computing allows you to process huge amounts of real-time data in the main memory of a server to provide instantaneous results from analyses and transactions. The benefits of in-memory computing are phenomenal – imagine being able to access real-time operational information within seconds.
That’s a pretty amazing feat – surely to be appreciated by technologists who have lived through various generations of computing evolution, as well as businesses struggling to make timely decisions.
But there’s a missing piece. I’m not belittling the value of in-memory in any way, but Oliver’s post made me think hard about what’s needed for the benefits of in-memory processing to permeate business processes in a scalable way. My conclusion was this: unless the system is also going to magically make a decision or automatically invoke an action (e.g., execute a transaction or place a stop order on a check) based on this real-time insight, we face a universal bottleneck – the decision-makers who need to band together to use this data are woefully scattered (and worse, unknown) across most organizations today.
Of course, certain decisions are made by individuals, and in those cases there’s direct value from this technology. And if we’re recreating BI with these nifty advancements for the benefit of the executive brass all over again, that’s fine too. But the next wave of analytics needs to be pushed down into the hands of teams and line individuals to truly drive performance. For that, we need a strategically designed collaborative fabric that can locate the right people to group together to leverage this real-time data, facilitate the decision in an auditable fashion, and update systems of record with better, more timely data accordingly. Well-designed collaborative plans will leverage dynamic, rich identity profiles and the appropriate collaboration metaphor (streams, project spaces, etc.) to create the perfect complement to real-time data and intelligence access. Together, these two advancements comprehensively accelerate process performance.
Today’s often siloed, ERP-system-based designs stand in sharp contrast to the more people-centric enterprise footprint needed to improve discrete, non-repeatable business output. In-memory has tremendous promise – I loved the demos at SAP Sapphire, and I expect to see more at SAP TechEd next week. But I hope its wide-scale adoption doesn’t stutter due to an acute case of technology innovation outpacing practical, scalable, real-world applicability.
I remember how we used to admire Cisco’s ability to do a virtual close on its accounting books every day. That was a cool technology feat for its time. But it didn’t do much to help the company preemptively respond to downward demand forecasts in the market crash of the dot-com days or the recent recession. Analytics and business intelligence capabilities need to leave the top floors and corner offices and become an active tool for all managers and workers in the enterprise. In-memory brings that sophistication, no doubt, but collaboration federates the use of this amazingly accurate snapshot of progress-in-the-moment.
SVP, Enterprise Social and Collaborative Software, SAP