
Assessing your data function with economic thinking

The conversations around amplifying your data capability can be exciting as you consider new technologies: scalability, real-time analytics, AI-driven insights. But without a concerted assessment of cost, we can’t fulfil our organisational obligation to maximise enterprise value. And the sticker price you negotiate with vendors is only one part of the picture. The total expense encompasses a trifecta of capital expenditures (CapEx), operational expenditures (OpEx), and slippery hidden costs: learning curves, productivity dips, project failure, vendor repositioning; the list goes on.


Gartner research found that fewer than half of all data analytics teams effectively provide value to their organisation. That’s a huge failure rate. And that’s before we even start to talk about the costs of those teams.


Why is this so arduous? It’s not just about buying a software suite and watching the magic happen. You’re also investing time and talent, and in most cases altering your entire operational model to make room for this new pipeline. These complexities are shape-shifters. One moment they’re a line item on a spreadsheet; the next, they’ve spiralled into unexpected compliance roadblocks or the monthly fine-tuning of that machine learning model you were told was “set it and forget it.”


If you think in terms of sunk costs, you may feel trapped by the magnitude of the financial commitment. But viewed through the lens of optionality, each decision point is a chance to pivot, adapt, and find leverage.


What about those phantom expenses, the ones that don’t neatly fit into Excel cells? No, they aren’t figments of your imagination. These are real costs, often insidious in nature, stealthily eroding the ROI you so confidently projected in that initial pitch meeting. These are the costs that people conveniently ignore, primarily because they don’t align with the neat financial models everyone’s so fond of. It’s far easier to sidestep these incalculable factors than wrestle them into your forecasts. But it’s precisely these complexities that can make or break the success of your data analytics capability.


Got it? Let’s move on.



The Limitations of Conventional Frameworks and the Call to Action for Technology Leaders

Go-to models like “total cost of ownership” (TCO), propagated by consultancies such as McKinsey or Gartner, have their merits, but let’s be blunt: they’re often inadequate. These frameworks excel at quantifying direct costs (hardware, software, and personnel) but flounder when grappling with more nuanced variables, which are seldom considered: the risk of project failure, the cost implications of false positives and negatives in analytics, or the agility tax of being too slow to adapt to new data paradigms.


As a technology leader, the onus isn’t merely on overseeing a set of tasks; it’s a clarion call to maximise the value-to-cost ratio. Ensuring costs don’t spiral out of control is your remit. Why? Because when costs balloon, it’s not just a budgeting snafu; it’s a strategic failure. An imbalanced value-to-cost ratio not only strains financial resources but also misdirects human capital and squanders time, the most perishable of all resources.


Conventional frameworks also commit another cardinal sin: they don’t account for the hidden arsenal within your existing infrastructure. You have databases languishing with untapped potential, tools capable of more than they’re currently tasked with, and human expertise that’s severely underutilised. The value of these existing assets, if leveraged judiciously, can offset your analytics investment substantially.


What’s more, new technologies like large language models (ChatGPT, for example) remain conspicuously absent from most TCO calculators, even though they hold promise for augmenting analytics capabilities. That’s like calculating the speed of a car while ignoring the turbocharger under the hood.


To bring it all home: if you’re not continually recalibrating your approach to stay in tune with these multi-dimensional factors, you’re not just maintaining the status quo; you’re actively driving down the ROI. And in a world that measures pace in microseconds and values agility over mere efficiency, the importance of that recalibration shouldn’t be underestimated.

Alright, conventional models have their gaps. What’s the alternative?



A New Lens — Focusing on Cost-to-Value Inefficiencies through Arbitrage

Shifting the narrative requires a re-evaluation of how we approach the cost and value equation of data analytics capabilities. It’s time to introduce a new lens: arbitrage. By evaluating your data resources as if they’re commodities traded in a marketplace, you can uncover inefficiencies in your setup that are ripe for improvement.


Arbitrage involves identifying discrepancies between what you’re paying for each stage of your data analytics lifecycle and what those stages are worth in the market. This approach extends beyond data storage or processing costs to include the cost of generating actionable insights, and even the ultimate delivery of those insights to decision-makers.


The concept underscores the importance of pinpointing exactly where your costs outweigh the value provided, and vice versa. Instead of relying on monolithic cost metrics, breaking the analysis down into these finer stages allows for a much more dynamic and responsive strategy.


Let’s consider the example of data processing. The cost of acquiring raw data might be relatively low for your organization, but what about refining it into something actionable? If the market offers solutions that are more efficient than your in-house capabilities, that’s an arbitrage opportunity. Similarly, if your analytics are generating insights that are significantly more valuable than the market rate, you have a unique asset that could be further leveraged.
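
To make that concrete, here’s a minimal sketch of the per-stage comparison, assuming you can attach a rough internal cost and a market-rate estimate to each lifecycle stage. The stage names and figures below are purely illustrative.

```python
# Minimal sketch: flag arbitrage opportunities per pipeline stage.
# Stage names and figures are illustrative placeholders, not benchmarks.

stages = {
    # stage: (annual internal cost, estimated market rate for the same output)
    "data acquisition": (40_000, 45_000),
    "processing and refinement": (120_000, 70_000),   # in-house costs exceed the market
    "insight generation": (90_000, 150_000),          # in-house beats the market
}

for stage, (internal_cost, market_rate) in stages.items():
    gap = internal_cost - market_rate
    if gap > 0:
        verdict = f"candidate to buy or outsource (overpaying by ~{gap:,})"
    else:
        verdict = f"in-house advantage worth protecting (~{-gap:,} below market)"
    print(f"{stage:>26}: {verdict}")
```

Positive gaps point to stages worth outsourcing or renegotiating; negative gaps mark capabilities the market can’t yet match at your price.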

What makes this concept powerful is its built-in feedback loop. If you keep an eye on market rates and trends, your understanding of your own capabilities in relation to the market continually refines itself. This allows for a more agile and adaptive approach to maintaining, or ideally improving, your cost-to-value ratio.


The value of this approach lies in its clarity. It’s not about maintaining a budget; it’s about understanding the market dynamics at play, allowing you to make better-informed decisions. The end goal is to use this arbitrage lens as a tool for realigning resources to gain maximum returns.



The Cost/Value Heatmap — A Pragmatic Tool for Actionable Insights

Enough of theoretical posturing; let’s talk about how to operationalise this concept of arbitrage. Enter the Cost/Value Heatmap — a tool designed to tangibly assess your data analytics capabilities vis-à-vis market benchmarks.


Creating this heatmap involves mapping each segment of your data analytics pipeline against corresponding market values. Whether it’s data acquisition, storage, processing, analysis, or dissemination of insights, each element should have its cost and value markers. Once populated, this heatmap serves as an illuminating guide for decision-making.


For example, let’s say you discover that building in-house automated extract, load, and transform (ELT) tools would be disproportionately costly, while market offerings are competitively priced, even after accounting for staffing and project management costs. You’ve just pinpointed an arbitrage opportunity where reallocating resources could result in outsized returns.
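
A back-of-the-envelope version of that build-versus-buy sum might look like the sketch below. Every figure is hypothetical; substitute your own staffing, project and subscription numbers.

```python
# Hypothetical build-versus-buy comparison for ELT tooling over a three-year horizon.
# All figures are placeholders for illustration only.

build_first_year = 2 * 110_000 + 30_000   # two engineers plus project management overhead
build_ongoing = 0.5 * 110_000             # roughly half an engineer to maintain and extend
buy_first_year = 36_000 + 15_000          # vendor subscription plus initial integration work
buy_ongoing = 36_000                      # subscription renewal

horizon_years = 3
build_total = build_first_year + build_ongoing * (horizon_years - 1)
buy_total = buy_first_year + buy_ongoing * (horizon_years - 1)
print(f"Three-year build: {build_total:,.0f}  vs  three-year buy: {buy_total:,.0f}")
```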


The true power of the heatmap is its capacity to reframe your perspective. Traditional ‘cost centres’ can be seen anew as potential investment opportunities for higher ROI. Moreover, this isn’t a one-off activity. The landscape of data analytics is in constant flux; keeping your heatmap updated will allow you to seize arbitrage opportunities as they arise.


But what about the caveat of dynamic change? Given the rate at which data paradigms are evolving, your heatmap can become obsolete almost overnight. To counter this, don’t just park this tool in your annual review toolkit. Make it a staple in your quarterly, if not monthly, review cadence. Adapt or perish; those are the stakes.


The actionable step here is to initiate the process of creating your Cost/Value Heatmap. It’s a living document that should become a cornerstone of your strategic thinking in data analytics, making the concept of arbitrage a practical tool rather than an abstract principle.



How to Build Your Cost/Value Heatmap: A Step-by-Step Guide

A Cost/Value Heatmap doesn’t materialise out of thin air. It requires concerted effort and strategic thinking. Here’s how to build one.


1. Inventory Existing Assets and Processes: List all the tools, technologies, and processes that are part of your data analytics pipeline. From raw data sources to the final dashboards where insights are displayed, capture everything.


2. Identify Costs: For each listed item, attach all associated costs — hardware, software, manpower, third-party services, etc. Don’t forget about less-obvious costs like the price of suboptimal decisions made from poor data quality.


3. Estimate Value: This is the trickier part. Use market benchmarks and internal data to assign value to each component. Look at metrics like the speed of generating insights, the impact on revenue, and any competitive advantage gained.


4. Map Costs and Values: Now comes the visualisation. Use any data visualisation tool to create the heatmap: plot cost on one axis and value on the other, then place each asset or process as a point. The resulting four quadrants let you assess your value-to-cost opportunities (see the sketch after this list).


5. Identify Arbitrage Opportunities: Highlight areas where cost and value are mismatched when compared to market averages. These are your points of focus for realignment.


6. Prioritise: Not all arbitrage opportunities are created equal. Apply the 80/20 rule to focus on the high-impact areas first. These should be addressed as a priority during your strategic planning.


7. Action Plans: Once you’ve prioritised, assign teams to come up with specific actions to correct these imbalances. This could range from renegotiating vendor contracts to switching to more cost-effective open-source solutions.


8. Review and Update: The heatmap is not a static entity. Make it a point to revisit it periodically to adjust for market shifts and internal changes.
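
To make step 4 less abstract, here’s a minimal sketch of the quadrant logic, assuming each asset has already been scored for cost and value. The assets and scores below are placeholders for your own inventory.

```python
# Minimal sketch of the cost/value quadrant logic from step 4.
# Assets and scores are placeholders; replace them with your own inventory.
from statistics import median

assets = {
    # asset or process: (cost score, value score) on whatever scale you use
    "raw data ingestion": (3, 4),
    "data warehouse": (8, 9),
    "legacy reporting suite": (7, 2),
    "ML experimentation platform": (6, 8),
    "manual spreadsheet reconciliation": (5, 1),
}

cost_mid = median(cost for cost, _ in assets.values())
value_mid = median(value for _, value in assets.values())

def quadrant(cost: float, value: float) -> str:
    """Classify an asset into one of the four cost/value quadrants."""
    if value >= value_mid:
        return "protect and invest" if cost < cost_mid else "high value, high cost: optimise"
    return "low priority" if cost < cost_mid else "arbitrage candidate: renegotiate or replace"

for name, (cost, value) in assets.items():
    print(f"{name:>34}: {quadrant(cost, value)}")
```

Splitting on the medians is just one convenient dividing line; market benchmarks or budget thresholds work equally well.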


Incorporate the heatmap into your regular review cycles and it’ll evolve into a robust mechanism for ongoing value optimisation. It serves as a quantifiable measure for making informed decisions without shooting in the dark. Thus, it should be a top priority when planning strategically.



Continuous Improvement — The Iterative Nature of Cost/Value Analysis

Your Cost/Value Heatmap is a potent tool, but its efficacy hinges on its currency. Given the ever-shifting dynamics of the data analytics landscape, this isn’t a set-and-forget instrument. Treat it as a living document that undergoes periodic re-evaluations. Here’s how:


  • Scheduled Reviews: Embed Cost/Value Heatmap evaluations into your quarterly strategic reviews, at a minimum. This ensures alignment with your overall corporate strategy and budgets.

  • Variable Monitoring: Establish a set of key performance indicators (KPIs) that will signal when it’s time to reassess particular elements of the heatmap. For example, a sudden uptick in server costs or a dip in analytical throughput could trigger an immediate review (a minimal sketch of such triggers follows this list).

  • Responsibility: If you’re a technology leader, you bear the onus of ensuring costs don’t spiral uncontrollably. Your charge isn’t merely about maintaining infrastructure; it’s about relentlessly refining the value-to-cost ratio.

  • Market Surveillance: Keep tabs on market shifts. New technologies and pricing models appear frequently, and you’ll want to seize these arbitrage opportunities as they arise.

  • Feedback Loops: Cultivate channels for receiving feedback from the workforce actively using these tools and systems. They are often the first to spot inefficiencies or suggest possible improvements.
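
As flagged in the Variable Monitoring point above, here’s a minimal sketch of threshold-based triggers, assuming you track period-on-period percentage changes for a handful of KPIs. The metric names and limits are illustrative assumptions, not recommendations.

```python
# Minimal sketch of KPI triggers that flag when parts of the heatmap need an early review.
# Metric names and thresholds are illustrative assumptions.

thresholds = {
    "monthly_server_cost_change_pct": 15.0,    # flag a sudden uptick in infrastructure spend
    "insight_turnaround_change_pct": -20.0,    # flag a dip in analytical throughput
}

def review_triggers(observed: dict) -> list[str]:
    """Return a message for every KPI that has breached its threshold."""
    flagged = []
    for metric, limit in thresholds.items():
        value = observed.get(metric)
        if value is None:
            continue
        # Positive limits flag increases; negative limits flag decreases.
        if (limit >= 0 and value >= limit) or (limit < 0 and value <= limit):
            flagged.append(f"{metric} at {value:+.1f}% breaches {limit:+.1f}%")
    return flagged

print(review_triggers({"monthly_server_cost_change_pct": 22.5,
                       "insight_turnaround_change_pct": -5.0}))
```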


In this game, the status quo is your enemy. Your heatmap needs to evolve with the market, internal capabilities, and objectives. Failing to update it is tantamount to working off outdated blueprints; it’s not just suboptimal, it’s a lapse in responsibility.


So, as you recalibrate your approach to forecasting the costs of your data analytics capabilities, remember this: In a landscape where the tools and technologies are ever-changing, the real competitive advantage lies in the ability to adapt — and to do so smarter and faster than the rest. The Cost/Value Heatmap isn’t just a tool for financial prudence; it’s a litmus test for your agility and adaptability in a complex ecosystem. Make it a top priority in your strategic planning and watch how it not only saves costs but also acts as a catalyst for innovation and growth.
