Why data management is mission-critical for Business Intelligence

When it comes to TCO and ROI, few are more knowledgeable than Bill Kirwin and Peter Brooks from IIIE, the International Institute of IT Economics. It was Kirwin, after all, who created the TCO methodology for IT during his tenure at Gartner in the Nineties.

So, I read with interest their recent white paper, Tableau Total Cost of Ownership – A Study of Modern Business Intelligence Implementations in the Real World.

It’s the “real world” part that caught my attention, as the end-user experience of Tableau is what ZAP works with every day. And IIIE’s research, though more scientific, echoed our own findings: “a comprehensive survey and a round of interviews with IT and business users, which included questions about the use of Tableau being compared to organizations’ legacy and/or competitive BI technologies.”

One of the key conclusions of their interviews jumped straight out. They list two of the top TCO drivers in developing modern BI as IT data management support and labor. For the latter, that’s:

“installation, training, data management, content development and usage… of modern BI products significantly exceed(ing) application and infrastructure investments.”

As ever, the answer is in the question, and it’s mentioned twice above: data management. That is to say, a software system able to automatically connect to, collect and integrate all data from all sources, before securing it and preparing it for analysis. Some may not see this as the most exciting area of business intelligence, but it’s clearly a must.
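To make that description concrete, here is a minimal sketch in Python of what those steps look like conceptually: collect records from two sources, integrate them into one consistent table, and strip sensitive fields before analysis. Everything in it (source names, columns, values) is hypothetical and purely illustrative; it is not drawn from ZAP, Tableau or the IIIE paper.

```python
# Illustrative sketch of the "connect, collect, integrate, secure, prepare" steps
# described above. All source names, columns and values are hypothetical.
import pandas as pd

# "Connect and collect": in a real system these would be live connectors to an ERP,
# a CRM, flat files and so on. Here, two tiny in-memory extracts stand in for them.
erp_orders = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "order_total": [1200.0, 450.0, 980.0],
})
crm_customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "customer_name": ["Acme Ltd", "Globex", "Initech"],
    "credit_card": ["4111-...", "5500-...", "3400-..."],
})

# "Integrate": join the sources into one consistent model keyed on customer_id.
combined = crm_customers.merge(erp_orders, on="customer_id", how="left")

# "Secure": drop or mask sensitive fields before the data reaches a dashboard.
prepared = combined.drop(columns=["credit_card"])

# "Prepare for analysis": the cleaned table is what a BI tool would consume.
print(prepared)
```

In a real implementation each of these steps is automated, scheduled and governed rather than hand-coded, but the sequence is the same one described above.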

Dashboards are, to quote Harvard Business Review, “sexy.” According to Power BI, they’re “beautiful.” Indeed, Tableau claims to harness “people’s natural ability to spot visual patterns.” But dashboards rely on quality data being ‘plumbed in’ from all corners of a business: what any dashboard provides is only as good as the information going into it. The better the data quality, the better the data insight. And the potential rewards are far greater when dashboards are used for ‘discovery’ rather than just ‘display’.

Once a trusted source of rich data is in place, and those dashboards are providing lightbulb moments, Kirwin and Brooks raise another interesting issue: governance. In perhaps the most inspiring line of the entire white paper, they point out that

“Modern BI is essentially high stakes end-user computing.”

And modern BI is generally no longer researched and carried out by a team of backroom professors. Tools such as Tableau, Power BI, Qlik and any number of self-service systems have placed it in the hands of every team in a business. Consequently, these “high stakes” really are in the hands of “end-users.”

Thus, security and governance in data management go from being ‘product features’ to being mission-critical requirements.

The final IIIE finding that leapt out at me concerned the CIO, as opposed to the CEO or COO. Regarding “the time spent in developing and using BI content,” they report, “a key swing factor… is time spent performing” four key tasks. The first is data source definition; the other three are what I call the ‘Three Ms’ of connectivity:

  1. Manipulation
  2. Modification
  3. Management

And, while the white paper in question focuses on Tableau, connectivity is a key factor across all BI systems, especially, in their view, Power BI, as it requires “the use of Microsoft products/languages such as that of DAX, MDX, Visual Studio, etc. A SQL Server data warehouse/data mart DB needs to be used in the Microsoft solution. New analyses can require technical skills to create SSAS cubes to deliver new results across new data dimensions.”

That is, of course, unless a rigorous, specialist data management system is put in place that covers all these aspects, not only the ‘Three Ms’ (which I’ll return to in a future blog) but also the three overarching elements explored here: labor, governance and connectivity.
