In my previous post, “The Analytics Blind Spot for Business Users”, I examined business users’ perception of their BI capability, and how often they fall short of getting full value from their business data. This week, as part of our ongoing series on How BI is Being Used in 2012, I want to take a closer look at how well organizations are able to handle the three forces most responsible for the challenge of big data: volume, velocity and variety.
In our research, we found that most companies, regardless of size, felt that they were able to handle the volume (i.e. the amount of data being collected and analyzed) and velocity (i.e. the speed at which the data is being collected and requested). Overall, 77% felt their IT infrastructure was adequate to handle the volume of data they would need in the near term, and 69% felt they had adequate infrastructure to handle their companies’ data needs when it came to velocity.
They were less confident about the variety of data (i.e. the growing range of data sources created by new technologies such as web logs, social media, RFID information, etc.); only 39% indicated that they had adequate infrastructure.
So what does this mean? Here are a few takeaways.
1. You might not be as prepared as you think for big (or even not-so-big) data
Business users are getting savvier. Although most companies’ IT groups felt confident in their ability to handle the volume and velocity of data that their business users require, they may not have factored in the growth in demand from those users. It won’t just be a matter of churning out more reports as more users become aware of the benefits of analytics (although we often see this “growing pain” for companies getting more mature with BI), it will also be a matter of providing more sophisticated analytical products.
2. Success with big data volume is tied to the success of enterprise data management
Many IT managers and executives view the volume of their data as being a function of the number of transactions they’re expected to collect in their transactional systems. It’s actually quite a bit more complex; volume is also dictated by how a company is able to aggregate, consolidate and correlate that data. In telecommunications and social networking, there’s a principle called “Metcalfe’s law” which says, essentially, that the value of a network grows in proportion to the square of the number of nodes, because each new node can connect to every existing one. Applying this principle to big data: as transactional records are linked and correlated, the number of connections between them — and with it the effective volume of data to be managed — grows far faster than the record count itself.
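To make the arithmetic behind that point concrete, here is a minimal sketch (the record counts are hypothetical, chosen only for illustration) showing how the number of potential pairwise connections among records grows quadratically even as the record count grows linearly:

```python
def potential_connections(n: int) -> int:
    """Number of distinct pairs among n records: n choose 2 = n*(n-1)/2."""
    return n * (n - 1) // 2

# Illustrative record counts -- a 100x increase in records yields
# roughly a 10,000x increase in possible connections to manage.
for records in (10, 100, 1_000):
    print(f"{records:>5} records -> {potential_connections(records):>7} possible connections")
# 10 records -> 45; 100 -> 4,950; 1,000 -> 499,500
```

This is why planning capacity around transaction counts alone understates the problem: the work of correlating data scales with the connections, not the records.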
Managing this data would be challenging enough if there were a single version of it, but for many companies, this isn’t the case. Data proliferates across the organization as each department creates its own data sources with its own definitions, leading to conflicting, inconsistent and sometimes simply wrong data. Part of a good “big data” strategy, then, is good enterprise data management. EDM helps you get control of your data: it takes a holistic view of how data is managed across the enterprise and across its lifecycle, from initial creation through eventual retirement.
3. There may be better problems to solve with your data than variety
The infrastructure needed to analyze these new forms of data does not integrate well with traditional transactional systems. The investment needed to create an integrated view of this variety of data (i.e. structured and unstructured) is, for the near term, cost prohibitive for all but a handful of companies.
This is especially true for companies that are underutilizing their current data. And as we’ve seen from the previous survey questions — where companies rely mostly on traditional reporting or scorecards and dashboards — there is significant value to be gained from improving advanced and predictive analytics. In short, companies looking to achieve near-term ROI from BI would be better served by improving their basic BI capabilities.
Companies that are able to develop and manage the technical foundations for big data — including the creation of more sophisticated analytics products and the development of enterprise data management programs — are one step closer to realizing significant value from their analytics programs. But they’re not there yet. The infrastructure may be in place, but the processes and skills needed to leverage that infrastructure for bottom-line impact, more often than not, aren’t. Next week, we’ll take a look at the non-technical challenges companies face with BI, and provide some insight into how to overcome them.
by Adrian Alleyne, Director Market Research
© DecisionPath Consulting, 2012