Weekend Reflection - Efficiency Theatre and the Tyranny of the Measurable


This episode is brought to you by Authentrend, providing biometric FIDO2 security solutions that make MFA actually work for small businesses. Check them out at authentrend.com/smallbizcyberguy


Pull up a chair. Let's talk about why we're all slowly destroying our businesses whilst congratulating ourselves on efficiency gains.

The Pattern

This week, we've examined the doorman fallacy from multiple angles:

Tuesday: The complete framework. Wednesday: The £7 million lesson from the British Library. Thursday: Practical evaluation methods. Friday: The £203,000 case study from Manchester.

Same pattern every time:

Identify a cost with an obvious function. Define the role narrowly. Find a cheaper alternative. Cut the original. Miss the invisible value. Watch the catastrophe unfold. Spend multiples of the savings recovering.

British Library: Save £50,000, spend £7 million. Northgate Creative: Save £12,000, spend £203,000. Your business: Coming soon to this pattern if you don't learn from their mistakes.

The question isn't "what's the pattern?" The question is: Why does this pattern repeat endlessly despite overwhelming evidence?

The Psychological Trap

Humans are terrible at valuing things that are invisible.

We evolved to respond to immediate, visible threats. Tigers are visible. Starvation is tangible. Injuries are obvious.

Prevented disasters are invisible. Breaches that don't happen leave no trace. Security that works perfectly is indistinguishable from security that's unnecessary.

This creates systematic bias toward cutting things that actually matter.

Training that prevents breaches: Invisible value. Breaches that occur without training: Obvious cost.

MFA that blocks credential theft: Invisible protection. Ransomware recovery: Catastrophically visible expense.

Insurance that covers breach costs: Theoretical benefit. Uncovered expenses during breach: Brutally real costs.

We're psychologically wired to undervalue invisible protection until disaster makes it visible.

The Measurement Problem

Peter Drucker supposedly said, "What gets measured gets managed."

This is causing catastrophic damage across organisations.

Because not everything that matters can be measured, and not everything that can be measured actually matters.

Training completion rates: Measurable. Breaches prevented by trained staff: Unmeasurable.

MFA login friction: Quantifiable in seconds. Credential theft prevention: Impossible to count non-events.

Insurance premiums: Line item in budget. Insurance value: Unknown until a crisis.

The measurable things become proxies for value. Then we optimise proxies whilst destroying actual value.

The Political Economy of Cost-Cutting

Let's talk about incentives—they explain everything.

Who benefits from security cost cuts?

CFOs are evaluated on budget efficiency, finance directors are measured on cost reduction, board members are judged on profitability metrics, and executives are compensated based on bottom-line improvements.

Who bears the risk when cuts backfire?

CISOs blamed for breaches, IT staff handling incident response, operations teams managing recovery, customers affected by service disruption, and staff whose data gets leaked.

The people making decisions don't bear the consequences of being wrong.

This creates moral hazard: decision-makers are incentivised to make cuts that look good in the short term whilst externalising long-term risks onto others.

CFO cuts £50,000 from the security budget. The board applauds the efficiency. CFO gets a bonus.

Six months later, a breach costs £300,000. CISO blamed. IT blamed. "Why didn't you prevent this?"

CFO still has a bonus. Still has a reputation for efficiency. Consequences borne by others.

The Asymmetry of Credit

Consider the career incentives:

Cut £50,000 from the security budget: immediate recognition, quantifiable achievement, demonstrated efficiency, political capital gained, resume material, promotion consideration.

Prevent a £300,000 breach through proper spending: no recognition (the breach didn't happen), no quantifiable achievement (you can't count prevented incidents), looks like unnecessary spending, political capital lost, resume gap, passed over for promotion.

Nobody gets promoted for preventing disasters that never occur.

This asymmetry systematically rewards the destruction of invisible value.

The Spreadsheet Delusion

Modern business management is spreadsheet-driven. Everything must fit into cells. Be quantifiable. Support cost-benefit calculations.

But real value often can't be spreadsheet-captured:

How do you quantify Dave from IT's institutional knowledge? You can't. Until he's gone and systems fail in ways nobody else understands.

How do you measure the value of training in preventing breaches? You can't. Until training stops and breaches happen.

How do you calculate vendor relationship benefits? You can't. Until you switch providers and context evaporates.

The spreadsheet contains measurable things. The spreadsheet is not reality. Confusing the two destroys businesses.

The Efficiency Theatre

"Efficiency" has become performative rather than substantive.

Actual efficiency: Achieving more valuable outcomes with fewer resources.

Efficiency theatre: Making visible cuts that look impressive in board reports whilst destroying invisible value that sustains operations.

Examples:

Replace the doorman with an automatic door. Visible savings: £35,000. Invisible cost: Loss of service, security, wayfinding, and customer experience. Net result: Revenue decline exceeding savings.

Cut security training. Visible savings: £12,000. Invisible cost: Loss of threat awareness, incident recognition, and security culture. Net result: £203,000 breach.

Remove MFA. Visible savings: £50,000. Invisible cost: Loss of credential protection, attack surface reduction, and compliance posture. Net result: £7,000,000 ransomware recovery.

This isn't efficiency. This is destruction disguised as management.

The Hindsight Problem

After every doorman fallacy failure, people say, "Obviously we should have kept that."

British Library: "Obviously we should have implemented MFA."

Northgate Creative: "Obviously we should have maintained training."

Your business next year: "Obviously we should have kept [whatever you're cutting now]."

But it wasn't obvious beforehand.

Beforehand, it looked like wasteful spending on invisible benefits. Looked like good management to cut it. Looked like efficiency gains.

Only in hindsight does the invisible value become obvious.

And by then you've already paid the catastrophic cost of learning this lesson.

The Institutional Amnesia

Organisations that experience doorman fallacy failures often repeat them in different domains.

They get breached after cutting training, implement an expensive training programme, and learn that lesson.

Then they cut MFA for "practical reasons" and experience credential theft.

Then they cancel insurance to "save money" and get breached without coverage.

Then they replace IT staff to "improve efficiency" and lose institutional knowledge.

Why? Because lessons learned in one domain don't transfer.

"Training matters" doesn't generalise to "all invisible value matters." Each security control gets evaluated independently. Each falls victim to the same bias.

The Survivor Bias

"We haven't been breached, so clearly our security spending is excessive."

This is survivor bias at its most dangerous.

43% of UK businesses experienced breaches in 2025. That means 57% didn't.

The 57% who weren't breached conclude their security spending is wasteful.

They're lottery winners concluding that lottery tickets are good investments.

They're survivors of Russian roulette, concluding the gun isn't dangerous.

They're about to learn expensive lessons about probability and risk.
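
To put the probability point in numbers, here's a rough back-of-the-envelope sketch in Python. It assumes the 43% annual figure holds and that each year is an independent draw, which is generous to the "we've been fine so far" crowd:

```python
# Rough sketch: odds of at least one breach over several years,
# assuming a constant 43% annual breach rate and independent years.
annual_breach_rate = 0.43

for years in (1, 3, 5):
    p_no_breach = (1 - annual_breach_rate) ** years
    print(f"{years} year(s): {1 - p_no_breach:.0%} chance of at least one breach")

# 1 year(s): 43% chance of at least one breach
# 3 year(s): 81% chance of at least one breach
# 5 year(s): 94% chance of at least one breach
```

Surviving one year tells you very little about the next five. The unbreached 57% aren't proof of excessive spending; they're just earlier in the sequence.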

The Narrative Problem

Cost cuts make good stories:

"We identified £50,000 in wasteful security spending and reallocated resources to revenue-generating activities."

Sounds brilliant. Demonstrates business acumen. Shows tough decision-making.

Prevented disasters don't make stories:

"We spent £50,000 on security controls and nothing happened."

Sounds like wasted money. Demonstrates paranoia. Shows poor resource allocation.

The narrative structure of business success favours visible action over invisible protection.

Breaking the Cycle

How do we escape this trap?

Stop measuring efficiency by cost reduction. Start measuring by value preservation.

Stop rewarding visible cuts. Start rewarding prevented disasters even though they're invisible.

Stop optimising spreadsheets. Start optimising actual outcomes.

Stop defining roles narrowly. Start understanding complete value propositions.

Stop treating security as a cost centre. Start treating it as insurance against catastrophic loss.

Good luck implementing any of that in real organisations with real incentive structures.

The Brutal Reality

Here's what actually happens:

CFOs continue cutting security costs because that's what gets them promoted.

Boards continue demanding efficiency because that's what shareholders reward.

Organisations continue to fall for the doorman fallacy because the incentive structure guarantees it.

Breaches continue to occur because nobody fixes the systemic bias toward destroying invisible value.

And consultants like me continue writing case studies documenting predictable failures.

What You Can Actually Do

Given that organisational incentives won't change, what can individual security professionals do?

Document invisible value explicitly. Force the intangible into tangible form through regular reporting.

Calculate expected costs. Make theoretical risks concrete by adjusting costs for probability; there's a worked sketch just after this list.

Use comparable cases. The British Library's £7 million lesson is your political ammunition.

Shift risk assessment framing. "If we're wrong about this cut, what's the cost?" makes downside explicit.

Build political capital through small visible wins. Then spend it defending invisible value that matters.

Accept that you'll lose some battles. Choose which hills to die on strategically.

Document everything. When cuts backfire, ensure attribution is clear.
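
To make "adjusting costs for probability" concrete, here's a minimal sketch of the expected-cost arithmetic. The figures are hypothetical placeholders for illustration, not numbers from the British Library or Northgate Creative cases; substitute your own estimates.

```python
# Minimal expected-cost sketch. All figures are illustrative placeholders:
# plug in your own estimates before waving this at a CFO.
annual_saving = 12_000       # visible saving from cutting the control (GBP)
breach_probability = 0.25    # estimated annual likelihood of a breach without it
breach_cost = 200_000        # estimated cost of that breach (GBP)

expected_annual_loss = breach_probability * breach_cost
net_position = annual_saving - expected_annual_loss

print(f"Visible saving:       £{annual_saving:,}")
print(f"Expected annual loss: £{expected_annual_loss:,.0f}")
print(f"Net position:         £{net_position:,.0f}")
# Visible saving:       £12,000
# Expected annual loss: £50,000
# Net position:         £-38,000
```

If the expected annual loss dwarfs the visible saving, the "efficiency gain" is just a loss on a delay timer.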

The Uncomfortable Truth

Most businesses reading this week's content will make doorman fallacy mistakes anyway.

Not because they don't understand the pattern. They do.

Not because they can't calculate expected costs. They can.

Not because they lack examples of failures. British Library, Northgate Creative, and hundreds of others provide clear warnings.

They'll make mistakes because organisational incentives reward short-term visible cuts over long-term invisible protection.

And they'll learn expensive lessons individually, because each organisation thinks it's unique.

For Tomorrow

But before we move on, sit with this question over your weekend:

What invisible value is your organisation currently considering cutting to achieve visible efficiency gains?

Calculate the expected cost correctly. Understand what actually gets destroyed. Recognise that measurement bias favours catastrophic decisions.

Then decide if you want to be this decade's British Library, learning £7 million lessons about invisible value.

Or if you want to be boring and competent, maintaining protection that works invisibly, preventing disasters nobody celebrates, keeping your business operational.

The doorman does more than open doors.

Your security spending does more than its measurable function.

And you won't understand what it actually did until it's gone, and you're spending multiples of that recovering from predictable disasters.

Choose wisely. Your business survival depends on it.


Noel Bradford

Noel Bradford – Head of Technology at Equate Group, Professional Bullshit Detector, and Full-Time IT Cynic

As Head of Technology at Equate Group, my job description is technically “keeping the lights on,” but in reality, it’s more like “stopping people from setting their own house on fire.” With over 40 years in tech, I’ve seen every IT horror story imaginable—most of them self-inflicted by people who think cybersecurity is just installing antivirus and praying to Saint Norton.

I specialise in cybersecurity for UK businesses, which usually means explaining the difference between ‘MFA’ and ‘WTF’ to directors who still write their passwords on Post-it notes. On Tuesdays, I also help further education colleges navigate Cyber Essentials certification, a process so unnecessarily painful it makes root canal surgery look fun.

My natural habitat? Server rooms held together with zip ties and misplaced optimism, where every cable run is a “temporary fix” from 2012. My mortal enemies? Unmanaged switches, backups that only exist in someone’s imagination, and users who think clicking “Enable Macros” is just fine because it makes the spreadsheet work.

I’m blunt, sarcastic, and genuinely allergic to bullshit. If you want gentle hand-holding and reassuring corporate waffle, you’re in the wrong place. If you want someone who’ll fix your IT, tell you exactly why it broke, and throw in some unsolicited life advice, I’m your man.

Technology isn’t hard. People make it hard. And they make me drink.

https://noelbradford.com