Much has been written and said about the power of analytics in today’s business world. Whether as a means of deciphering what has already happened, understanding what is currently going on or predicting what might happen, there is no lack of consensus regarding the value of the data collected. For the organization that can mine and analyze that data, pursuing insights that can create a competitive advantage is an attractive, albeit potentially overwhelming, endeavor. And, like many of the information technology “movements” that have preceded it, the push for Big Data Analytics promises to equally frustrate and disappoint the half-hearted or misguided who take on the challenge. This has nothing to do with the resources or tools available, and it is certainly not bound by a lack of imagination; it has more to do with focus and expectations. To that end, it is the same problem IT professionals have been trying to solve since computers proliferated – finding a way to leverage technology to solve actual business problems. This is what drives value for the organization.
Increased Complexity of Today’s IT Landscape
Anyone who has ever been part of a large-scale implementation understands the challenges associated with delivering systems functionality to enable an evolving organization. Business processes and technical capabilities, as well as the people involved and their motivations, are constantly shifting and responding to market conditions. It can be a bit like making repairs to the car as it’s going down the road, with the driver and the landscape changing, but the speed remaining the same or increasing. That assumes the car is heading in the right direction to begin with, or that its navigation system has a waypoint set to keep it on course. The purpose, or the outcome, of any system is to collect data that can be analyzed and acted upon, so having a map to go from start to finish is a prerequisite for delivering real, usable information. That information, combined with related data from other sources and systems, has the potential to drive value, if it paints a clear picture. Never before has the organization had access to so much information to “mash up” and interpret, and never before have the mire and confusion been greater than they are today.
Getting it right up front has never mattered as much as it does now. Yet the world has already been built around imperfect systems that collect incomplete datasets, and systems implementations continue to be driven by factors that exacerbate this problem. Add to that the exponential increase of unstructured data elements that intersect with everything, and the problem of driving to clear and concise conclusions that create business value is more difficult than ever. Best-in-class tools are rolling off the assembly line daily to address this issue. Likewise, the world is rapidly filling with experts and theories to accompany them. This explosion of activity creates complexity of its own, and the more complex the problem becomes, the more every proposed solution starts to look the same.
A New Approach That Focuses on Value
Value for the organization is created by specific and intentional efforts either to increase what is good or to reduce what is bad – that is, what destroys value. Everything the organization does must be driven by initiatives tied to this principle. People, processes and technology must be focused together on moving the needle in a positive direction, and the organization must be committed to measuring the results of those initiatives over time. The technology employed to create, assimilate and analyze the data is consequential, then, but secondary to the mechanism by which that data is used to drive value. Additionally, analysis without some amount of subjective insight from the experts who know the business would require a near-perfect model, which has yet to be seen. Engaging the subject matter experts and key decision makers in building the model, as well as in interpreting the results and laying out prescriptive initiatives, is necessary to bridge that gap.
The argument, then, should be viewed as one of how to apply the tools, techniques and expertise available to solve a legitimate business problem. Doing so begins with defining the problem and involves breaking it down into its component drivers. Traditional models fail here because they tend to rely solely on quantitative measures that can be described with empirical data. However, the integration of qualitative and subjective information can provide insight not available in the data stores, have a material bearing on the outcome and, perhaps most importantly, engage the stakeholders. These subjective assumptions can become more quantitative when they are grounded in surveys or other means by which they can be quantified.
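To make this concrete, here is a minimal sketch of what such a decomposition might look like in code. The driver names, scales and weights are invented for illustration – they are not part of any prescribed model – but the sketch shows the key move: putting empirical metrics and survey-derived qualitative scores on a common scale so both kinds of driver can feed one composite measure.

```python
# Illustrative sketch only: driver names, raw values, ranges and weights
# below are assumptions, not prescribed by the approach in the text.

def normalize(value, low, high):
    """Map a raw measurement onto a 0-1 scale, clamped at the boundaries."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

# Empirical drivers, measured from system data.
empirical = {
    "on_time_delivery": normalize(0.92, 0.80, 1.00),
    "unit_cost": 1.0 - normalize(41.0, 30.0, 50.0),  # lower cost is better
}

# Qualitative drivers, quantified via stakeholder surveys (1-5 scale).
qualitative = {
    "customer_sentiment": normalize(3.8, 1.0, 5.0),
    "team_confidence": normalize(4.2, 1.0, 5.0),
}

# Weights agreed with the subject matter experts and key decision makers.
weights = {
    "on_time_delivery": 0.35,
    "unit_cost": 0.25,
    "customer_sentiment": 0.25,
    "team_confidence": 0.15,
}

drivers = {**empirical, **qualitative}
score = sum(weights[name] * value for name, value in drivers.items())
print(f"composite driver score: {score:.3f}")
```

The point of the sketch is not the arithmetic but the engagement it forces: the ranges and weights must come from the stakeholders, which is exactly what ties the model to the business problem.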
Given the complexity of current environments and the massive quantities of data available to influence these models, their structure should be viewed more as a network diagram than as the traditional organizational or hierarchical framework. This is because everything in the modern world is interconnected, and the real work is in understanding the linkages. These interdependencies between model elements tend to constrain them, as do threshold values, which apply to each element independently of the linkages.
Each of the model elements carries properties, such as a current value, upper and lower boundaries, and the direction in which it should be moving. When movement occurs at the sub-element level, it impacts the overall result, pushing the “score” along an axis toward or away from a desired outcome. When elements that can have an impact, but are not controllable, are incorporated, multiple scenarios are needed to understand the effect of changing values.
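The element-and-linkage structure described above can be sketched as a small network model. The element names, link sensitivities and boundaries here are invented for the example; the sketch simply shows how a change at the sub-element level propagates through linkages, is clamped by boundaries, and moves the overall score.

```python
# Hedged sketch of the network-style model described in the text; the
# specific elements, weights, links and bounds are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Element:
    value: float          # current value on a 0-1 scale
    lower: float = 0.0    # lower boundary (threshold)
    upper: float = 1.0    # upper boundary (threshold)
    weight: float = 1.0   # contribution to the overall score
    # Linkages: dependent element name -> sensitivity to this element's moves
    links: dict = field(default_factory=dict)

model = {
    "cycle_time": Element(value=0.55, weight=0.4,
                          links={"customer_satisfaction": 0.5}),
    "customer_satisfaction": Element(value=0.70, weight=0.6),
}

def nudge(model, name, delta):
    """Move one element, clamp to its bounds, and propagate through links.

    Assumes the linkage network is acyclic; a cycle would recurse forever.
    """
    el = model[name]
    el.value = min(el.upper, max(el.lower, el.value + delta))
    for dependent, sensitivity in el.links.items():
        nudge(model, dependent, delta * sensitivity)

def score(model):
    """Overall weighted score along the axis toward the desired outcome."""
    total_weight = sum(e.weight for e in model.values())
    return sum(e.weight * e.value for e in model.values()) / total_weight

before = score(model)
nudge(model, "cycle_time", +0.10)   # an improvement at the sub-element level
after = score(model)
```

Running scenarios then amounts to applying different nudges – including ones to elements the organization does not control – and comparing the resulting scores.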
This complex model, fed with both empirical and theoretical data, is really no different from a traditional model – until it is flipped upside-down. When the overall outcome can be moved along a defined scale to the desired position, producing the set of optimized and achievable sub-element outcomes necessary to reach that result, then the model – and the organization’s data – can be used to solve real business problems. Doing so requires an engine for performing top-down scenario analysis, using a complex set of recursive iterations, but it produces a series of value propositions that are based on the collective intelligence of the organization. Most importantly, it focuses the efforts of limited resources on work that all can agree is worth doing.
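A hedged sketch of the “flip” follows: given a target position on the overall scale, iterate top-down to find sub-element values that would achieve it. The simple proportional-adjustment scheme here is one possible choice, not the specific engine the text describes, and the element values, weights and bounds are again assumptions for illustration.

```python
# Illustrative goal-seek: solve for sub-element values that reach a target
# overall score. The adjustment scheme and all numbers are assumptions.

def solve_for_target(values, weights, bounds, target, iterations=100):
    """Iteratively adjust sub-element values toward a target weighted score."""
    values = dict(values)  # work on a copy; return the proposed plan
    total_weight = sum(weights.values())
    for _ in range(iterations):
        current = sum(weights[k] * values[k] for k in values) / total_weight
        gap = target - current
        if abs(gap) < 1e-6:
            break
        # Spread the remaining gap across all elements, respecting bounds;
        # clamped elements leave a residual gap for the next iteration.
        for k in values:
            lo, hi = bounds[k]
            values[k] = min(hi, max(lo, values[k] + gap))
    return values

values = {"cycle_time": 0.55, "customer_satisfaction": 0.70}
weights = {"cycle_time": 0.4, "customer_satisfaction": 0.6}
bounds = {"cycle_time": (0.0, 0.80), "customer_satisfaction": (0.0, 0.90)}

plan = solve_for_target(values, weights, bounds, target=0.80)
```

The returned plan is the set of sub-element targets – the value propositions – that the stakeholders can then vet for achievability, which is where the collective intelligence of the organization comes back into the loop.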
There is nothing inherently good or bad about new technology trends in and of themselves. However, the business road is littered with projects that delivered exactly what they set out to deliver and yet never fully realized their value proposition. A new class of systems and tools has made it possible to create and analyze datasets of never-before-imagined proportions, and the temptation to dive in is strong. Information technology professionals must be diligent in continuing to push for a deeper understanding of the questions being asked before attempting to provide the answers, so that value is created by design.