IT leaders have long searched for better ways to understand their IT costs. There are many ways to go about it, from spreadsheet manipulation and homegrown tools to slick business intelligence platforms. 

These approaches may get the job done (or not), but they are all too often riddled with errors, limitations, unnecessary expense, and extensive manual manipulation (you can read more about spreadsheet fails and other flawed approaches to cost analysis in my previous posts).

So before you are wooed by a business intelligence tool and find yourself disappointed with the output, I encourage you to carefully consider what you are hoping to achieve. I’m not a betting man, but I’d gamble that more often than not, a business intelligence tool isn’t going to give you the level of cost analysis you’re after. Allow me to explain…

Customizing a Business Intelligence Tool

Many organizations recognize the value of transitioning from reactive, ad hoc cost analysis to something more structured and repeatable, often resulting in a regular cadence of quarterly business reviews or monthly variance analyses.

To provide consistent input to these processes, some IT organizations build custom solutions on business intelligence platforms. At their core, business intelligence tools contain high-performance calculation engines that perform aggregation math on millions of rows of tabular data. These engines automatically group related rows based on specific column values and calculate group-level metrics such as sums and averages. They perform these functions across multiple dimensions. For example, instead of just summarizing total costs per month (the time dimension) or total database-related costs (the technology dimension), they also calculate the combination of the two: database costs for each month.
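
To make that concrete, here is a minimal sketch of that aggregation math in Python using pandas. The column names and figures are hypothetical, and a real platform runs this over millions of rows, but the grouping logic is the same:

```python
import pandas as pd

# Hypothetical cost records; a real BI platform handles millions of rows.
costs = pd.DataFrame({
    "month":      ["2024-01", "2024-01", "2024-02", "2024-02"],
    "technology": ["Database", "Storage", "Database", "Storage"],
    "amount":     [120_000.0, 45_000.0, 118_000.0, 47_500.0],
})

# One dimension at a time: total cost per month, total cost per technology.
per_month = costs.groupby("month")["amount"].sum()
per_tech = costs.groupby("technology")["amount"].sum()

# The combination of the two dimensions: database costs for each month,
# storage costs for each month, and so on.
per_tech_per_month = costs.groupby(["month", "technology"])["amount"].sum()
print(per_tech_per_month)
```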

This concept is similar to pivot tables in spreadsheets, but far more powerful and scalable. Imagine performing the math described above for potentially dozens of dimensions. Business intelligence tools compute these combinations of aggregations across many dimensions for millions of rows, and they do it all before the user needs to see the results. As soon as new data arrives, aggregation math is calculated for every defined intersection between dimensions and the results are stored, so when a user wants to view the data there's no waiting for complex calculations to run; the system simply retrieves the pre-calculated result and displays the report instantly.
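
And here is a rough illustration of that pre-computation idea, again with invented data. An OLAP-style engine does this far more efficiently, but the principle is the same: every intersection of dimensions is aggregated ahead of time, so a report becomes a simple lookup:

```python
from itertools import combinations

import pandas as pd

# Hypothetical cost records with a third dimension added.
costs = pd.DataFrame({
    "month":       ["2024-01", "2024-01", "2024-02", "2024-02"],
    "technology":  ["Database", "Storage", "Database", "Storage"],
    "cost_center": ["CC-100", "CC-200", "CC-100", "CC-200"],
    "amount":      [120_000.0, 45_000.0, 118_000.0, 47_500.0],
})

dimensions = ["month", "technology", "cost_center"]

# Pre-compute an aggregate for every non-empty subset of dimensions,
# the way a cube is materialized in advance of user queries.
cube = {}
for r in range(1, len(dimensions) + 1):
    for dims in combinations(dimensions, r):
        cube[dims] = costs.groupby(list(dims))["amount"].sum()

# Viewing a report is now a lookup of a pre-calculated result,
# not a fresh pass over the raw rows.
print(cube[("month", "technology")])
```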

Since most IT organizations already have business intelligence technologies (and associated staffing) in-house, it makes sense to explore whether that investment can be leveraged for IT cost analytics. Furthermore, leading enterprise resource planning vendors offer their own business intelligence platforms, making them a natural extension for analysis of financial data. Unfortunately, there are concrete, practical reasons that this approach consistently fails for IT cost analytics.

Why It Fails for Cost Analytics

The simplest approach to using business intelligence tools for IT cost analytics is a variation of the finance reporting and spreadsheet methods described earlier. In this approach, general ledger entries are fed into and analyzed by the business intelligence platform. While these tools are great at aggregating and summarizing tabular data, they cannot invent granularity that doesn't exist in the raw data. The data feed from corporate finance fundamentally lacks the granularity needed for effective IT decision-making.

Using general ledger data, a business intelligence platform can easily aggregate cost records by time, cost center, and account code. This yields metrics such as database costs broken down by hardware, software, labor, etc., and summarized for each month. While valuable, this only scratches the surface of what’s needed for effective decision-making. An IT leader cannot make a stand-alone choice to retire a database platform. Such a decision is much more likely to be made at the level of the applications that depend on the database platform, and very often in IT there are multiple applications or services that depend on such underlying technologies.
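
A quick sketch makes the gap visible. Using hypothetical general ledger rows, pooling database costs by account is trivial, but there is no application column to group by, and no amount of engine horsepower can conjure one:

```python
import pandas as pd

# Hypothetical general ledger feed: month, cost center, account, amount.
# Note what is missing: any application or service dimension.
gl = pd.DataFrame({
    "month":       ["2024-01", "2024-01", "2024-01"],
    "cost_center": ["Database Team", "Database Team", "Database Team"],
    "account":     ["Hardware", "Software", "Labor"],
    "amount":      [40_000.0, 55_000.0, 90_000.0],
})

# Easy: database costs broken down by account and summarized per month.
db_pool = gl.groupby(["month", "account"])["amount"].sum()
print(db_pool)

# Impossible from this data alone: how much of that pool supports the
# CRM application versus the ERP system. The granularity was never
# captured in the raw rows, so no aggregation can produce it.
```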

Consequently, it's not enough to simply pool together all of the database-related costs. IT leaders need to know how these pooled technology costs support other elements in the IT supply chain, such as applications or services. By the same token, decisions and conversations about technology need to account for the demand side. How much of each application is being consumed by each business unit? By extension, how much of each application's total costs should be attributed (or allocated) to the consuming business units?

Answering questions like these requires going beyond the pooling of costs from the general ledger that business intelligence tools perform. In fact, it requires a sophisticated "disaggregation," or rules-based apportioning, of pooled costs among the consuming technology or business unit elements. This routing of costs must be defensible and explainable to technology owners and business unit leaders in order to earn their buy-in and give them a basis they can rely on for business decisions. Unfortunately, business intelligence tools lack the business logic needed to allocate costs intelligently, routed by technical relationships and weighted by actual consumption data.
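
For illustration only, here is what a crude, consumption-weighted apportioning might look like. The applications, the consumption metric, and the figures are all invented, and a real allocation model layers many such rules, but it shows the kind of logic involved:

```python
import pandas as pd

# Pooled monthly database cost from the general ledger (hypothetical).
db_pool = 185_000.0

# Consumption data gathered outside the general ledger, e.g. database
# hours consumed per application. Names and values are illustrative.
consumption = pd.Series({
    "CRM": 400.0,
    "ERP": 250.0,
    "Data Warehouse": 350.0,
})

# Rules-based apportioning: each application's share of the pool is
# weighted by its share of actual consumption.
weights = consumption / consumption.sum()
allocated = (weights * db_pool).round(2)
print(allocated)  # CRM 74000.0, ERP 46250.0, Data Warehouse 64750.0
```

Notice that even this toy version depends on consumption data that never appears in the general ledger, which is exactly the business logic a generic business intelligence platform doesn't bring with it.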

More to Come

Tomorrow, I'll cover the use of rate cards in IT costing with business intelligence tools, hidden flaws in the data, and why these systems don't handle change well. Come back then to learn more. In the meantime, be thoughtful about your requirements and ask the right questions when purchasing new enterprise software for cost analysis, and you can be more confident that you're making the right decision for your business.
