Editor’s note: This is the second of a two-part series on business intelligence tools. (Check out part one here.) For further reading on cost analysis approaches that fall short of expectations, download “Detours on the Road to IT Cost Transparency.”
Are you asking questions like, “How much of each application is being consumed by each business unit?” or “How much of each application’s total costs should be attributed (or allocated) to the consuming business units?”
If you’re using a business intelligence tool to answer them, you’re probably going to have problems. Decisions and conversations about technology need to account for the demand side, and answering these types of questions requires going beyond the pooling of costs from the general ledger that business intelligence tools perform.
In fact, it requires a sophisticated “disaggregation,” or rules-based apportioning, of pooled costs among the consuming technology or business unit elements. This routing of costs must be defensible and explainable to technology owners and business unit leaders in order to earn their buy-in and their reliance on it for business decisions. Unfortunately, business intelligence tools lack the necessary business logic for allocating costs intelligently, routed by technical relationships and weighted by actual consumption data.
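To make the idea concrete, here is a minimal sketch of consumption-weighted allocation. All names and figures are invented for illustration; a real TBM system would drive this with technical relationships and measured usage data from many sources.

```python
def allocate(pool_cost, consumption):
    """Split a pooled cost across consumers in proportion to measured usage."""
    total = sum(consumption.values())
    return {unit: pool_cost * usage / total
            for unit, usage in consumption.items()}

# Hypothetical: a $90,000 shared server pool consumed by three business
# units, apportioned by measured CPU-hours rather than split evenly.
shares = allocate(90_000, {"Sales": 500, "Finance": 300, "HR": 100})
# Sales bears 5/9 of the cost, Finance 3/9, HR 1/9.
```

Because each business unit’s share is traceable to its own measured consumption, the resulting numbers are defensible in exactly the way an even split or an opaque estimate is not.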
The Rate Card Approach
An alternate school of thought on IT costing with business intelligence tools involves the concept of rate cards. The idea is to work around the lack of allocation logic by assigning rates to each type of element in the IT supply chain and then using unit volumes to calculate and aggregate resulting costs.
This typically starts with the periodic (often annual) establishment of rates that define the price of each discrete unit of various IT offerings. For example, rate cards might be defined for units like “desktop compute user” or “megabyte of storage.” The prices result from analysis of the prior year’s costs plus anticipated cost growth. Then, each month, updated IT operational data is loaded into the business intelligence platform to enumerate the IT resources. These units are costed by applying rates from the rate card, and the platform aggregates the resulting costs to determine the total cost of applications or IT services.
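The mechanics of the rate card approach can be sketched in a few lines. The rates, unit names, and volumes below are hypothetical; the point is only the shape of the calculation — fixed price per unit, multiplied by monthly volumes, aggregated per application.

```python
# Hypothetical rate card: a price per unit, set once a year.
rate_card = {"desktop_compute_user": 55.0, "storage_mb": 0.02}

# Hypothetical monthly operational data enumerating unit volumes per app.
volumes = {
    "CRM":   {"desktop_compute_user": 120, "storage_mb": 500_000},
    "Email": {"desktop_compute_user": 800, "storage_mb": 200_000},
}

def rate_card_cost(volumes, rate_card):
    """Cost each unit at its card rate, then aggregate per application."""
    return {app: sum(rate_card[unit] * count for unit, count in units.items())
            for app, units in volumes.items()}

costs = rate_card_cost(volumes, rate_card)
```

Note that actual spend never enters the monthly calculation — only last year’s rates and this month’s volumes — which is precisely where the approach goes wrong, as discussed next.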
On the surface, business intelligence tools seem a good fit for this rate card approach because they provide high performance aggregation of the base metrics into views by technology stacks, applications, IT services, or even business units. However, in practical, real-world usage, business intelligence tools fall short.
A Hidden Flaw
The flaw with this approach is that it provides only an estimate of costs based on rate cards that may have been calculated months before, and may exclude hidden costs from shared or overhead sources. Or it may include shared or overhead sources, but in a way that results in over-counting. This is not to say that rate cards are inherently bad, but when they form the basis of all cost calculations, they’re almost certain to yield inaccurate costs.
Most organizations that establish rate cards fail to fully account for shared or indirect costs such as datacenter power and cooling, telecom, and support labor. Because these costs are not directly attributed to the rate-carded element, the personnel devising the rate card may not even have visibility into them, and they rarely have a straightforward methodology for attributing an appropriate fraction of these indirect costs to the rate card. Consequently, the total of all rate card-based costs is often lower than the IT organization’s actual costs. Likewise, the unit volumes upon which these costs are based often fluctuate in ways that don’t correlate directly with actual cost.
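A simple reconciliation makes the gap visible. The figures below are invented for illustration, but the pattern — rate-card totals reconciled against the general ledger coming up short — is the one described above.

```python
# Hypothetical year-end reconciliation: rate-card-derived totals vs. the
# general ledger. All figures are illustrative.
rate_card_costs = {"compute": 420_000, "storage": 180_000, "network": 150_000}
gl_actual_total = 900_000  # includes power, cooling, telecom, support labor

rate_card_total = sum(rate_card_costs.values())
uncaptured = gl_actual_total - rate_card_total   # indirect cost the rates missed
coverage = rate_card_total / gl_actual_total     # fraction of real spend captured
```

In this sketch the rate cards explain only about 83 percent of actual spend; the remainder is indirect cost that no rate-carded unit ever picked up.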
Furthermore, because of the lag between the annual rate card update exercise and when costs are incurred throughout the year, business conditions may have changed, resulting in rate cards that are substantially different from the current true costs for those units.
A Major Weakness
It’s tempting to rely on already-deployed business intelligence platforms to calculate IT costs, but a major weakness of many of them is their need for data to be sanitized before it enters the data warehouse layer. In addition to finance data, the cost calculations described earlier depend on incorporating IT operational data such as IT organization structure (business units, headcount), technology stack, application or IT service definitions, and unit volume or utilization data.
Unfortunately, most of these data sources are not well structured for integration with each other, and things are further complicated when large organizations have multiple types of tools due to size, organizational diversity, or acquisition history. When these disparate data sets are brought together, flaws emerge.
Many organizations have seen their implementations of these tools stall because the source data is not clean enough, and projects to clean it up are complicated and expensive. (And by the way, using business intelligence platforms for Technology Business Management is no exception.)
While business intelligence platforms purport to save time by automating complicated metric aggregations and exposing pre-calculated results via flexible reports, many suffer from a critical flaw. The range of questions that can be answered by those reports is bounded by what was anticipated when the reports and underlying data schema were designed. This means that either the reports themselves are static, or that the report filters and slicers have limited ability to recalibrate to answer tangential questions or drilldowns.
When an executive inevitably asks a question that was not anticipated during design of the cubes and reports, the business intelligence system often cannot provide an answer. The only remedy is to go back to the drawing board for design refinements, adding weeks or months to the decision cycle. Of course, this leads back to decisions based on instincts or estimates because facts are not readily available.
Aversion to Change
It is often said that change is the only constant, but these systems don’t handle change well. Given the central role of Technology Business Management (TBM) in supporting decisions that result in or from change, it’s crucial for a TBM system to keep up with changes such as:
- Replacement of an existing data source with another from a different vendor, resulting in different source data structure
- Onboarding of new systems resulting from an acquisition, often leading to multiple sources for the same type of data
- Expanding scope of TBM to cover additional technology segments, business processes, or decision types
- Broadening TBM adoption to subsidiary, outsourced, or regional IT teams resulting in team-specific data sources
Unfortunately, many business intelligence platforms do not cope well with changes to the underlying data or the business use of their output. Changing or adding data sources requires the engagement of experts to capture, clean, and ingest the new data, and data warehouse experts to adjust the schema to accommodate the data. If this is a new type of data, then cube and report designers will need to make further changes.
Making the Right Choice
Organizations that attempt this route discover that they’ve bitten off more than they can chew. Or worse, enterprise software purchases fail to deliver expected ROI because the wrong products are selected, as CIO.com recently reported. Don’t let this happen to you. An implementation gone awry or a product that doesn’t meet expectations can be a credibility killer, or even hurt your career.
At Apptio, we find that individuals charged with making these important software procurement decisions are not always well-equipped to understand how new enterprise software works with their existing IT tools and systems. Inevitably in a sales process, someone will say, “Don’t we already have (insert almost any business intelligence, enterprise resource planning, corporate performance management or other software here) for that?”
By being thoughtful about your requirements and asking the right questions to get the right tool for your needs, you can be more confident that you are making the best decision for your business.
Like what you’re reading? Sign up for our blog daily digest.