Today’s managers face two challenges that make business increasingly complex, especially against the backdrop of volatile market conditions. The first is information overload: the amount of data on the planet doubles roughly every 18 months (1). Finding information is therefore no trouble at all; the real challenge is weeding out the right information in order to build a useful body of knowledge. The second is that innovation cycles are constantly shortening, which makes it extremely difficult for virtually every industry to keep up: what is state-of-the-art today can be obsolete tomorrow (2). This is why the speed at which decisions are made is paramount.
Bottom line: Decision makers need accurate, real-time queries to support the overall decision-making process. Yet there are many hindrances, such as information overload (complexity) and volatile market conditions (uncertainty), which make queries take too long to remain up to date, so the output is outdated and/or useless. There is, however, an antidote: in-memory-computing. In-memory-computing is a quantum leap in making hardware as well as software ready for real-time business analysis on the basis of database queries.
What can be inferred from that?
Queries that required weeks can now be carried out in minutes. This brings the advantage that executives can ask follow-up questions. A classic example: a manager wants to identify the worst-selling product; the answer is Product A. The next query asks why Product A is not selling the way it should; the answer is supply chain problems caused by bad weather. If a manager has this information within minutes, he can take immediate action to overcome these shortcomings, for example by presenting alternative approaches that avoid the weather obstacles.
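This drill-down style of questioning can be sketched in a few lines of plain Python. The dataset, product names, and the "issue" field are purely illustrative assumptions, not real figures; the point is only that the answer to the first query feeds directly into the second.

```python
# Hypothetical sales records; names and figures are illustrative only.
sales = [
    {"product": "A", "units": 120, "issue": "supply chain (weather)"},
    {"product": "B", "units": 540, "issue": None},
    {"product": "C", "units": 310, "issue": None},
]

# First query: which product sells worst?
worst = min(sales, key=lambda r: r["units"])
print(worst["product"])   # → A

# Follow-up query, asked minutes later: why is it underperforming?
print(worst["issue"])     # → supply chain (weather)
```

The value of real-time analysis lies in the second query: if the first answer took weeks, asking "why?" would rarely be worth it.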
How does in-memory-computing work?
In-memory-computing enables users to analyze gigantic amounts of data quickly and cost-efficiently while reducing the complexity of the overall IT landscape (3). Consider the healthcare industry, which processes millions of datasets from various different sources: under normal conditions, obtaining detailed insights into this data would take far too long, whereas in-memory-computing can deliver them in seconds compared with the traditional database approach. This is possible because in-memory-computing manages data in columns instead of rows. A query therefore has to deal with less data: redundant or superfluous fields are minimized, so fewer fields have to be touched, and by sticking to the essentials processing becomes 10 to 50 times faster (4). Furthermore, in-memory-computing stores data in main memory instead of on the hard disk, which further improves processing power, as access to main memory is 120 times faster than access to disk (5). Here simplicity, that is, a more easily digestible approach for the computer, is key.
Bottom line: Data compression (storing columns instead of rows) together with keeping data in main memory (rather than on the hard disk) brings remarkable improvements.
What is in it for companies?
✔Real-time analysis at the speed of thought
✔What took weeks now takes seconds
✔Faster decision making
✔More agility and flexibility
✔Speed makes it possible to ask second and third questions to narrow a problem down
✔Lower cost: compressed data paired with main memory's favorable cost/size ratio
“We are witnessing the dawn of a new era in enterprise business computing, defined by the near instantaneous availability of critical information that will drive faster decision making, new levels of business agility, and incredible personal productivity for business users.”
Bill McDermott (Co-CEO, SAP, Newtown Square, Pennsylvania, USA)
(5) Plattner, H., and Zeier, A. In-Memory Data Management: An Inflection Point for Enterprise Applications. Berlin, Heidelberg: Springer Verlag, 2011. Print.
Philipp Stoeckler 2012/01/06 04:01