The McNamara Fallacy and Your Reporting Dashboard

Robert McNamara was the US Secretary of Defense during the Vietnam War. He was also, by most accounts, one of the most intelligent people ever to hold the job. Harvard MBA. President of Ford Motor Company at forty-four. A man who believed deeply, almost religiously, in data.

He ran the war on metrics. Body counts. Kill ratios. Sorties flown. Bombs dropped. Every week, the numbers went up. Every week, by the numbers, America was winning. The charts looked fantastic. The dashboards were green.

They lost the war.

The problem wasn't that McNamara measured things. The problem was that he measured what was easy to measure, ignored what wasn't, and then convinced himself that only the measurable stuff mattered. The sociologist Daniel Yankelovich later formalised this into what's now called the McNamara Fallacy, and it has four steps: measure what's easy to measure, disregard what can't be easily measured, presume what can't be measured isn't important, and then presume what can't be measured doesn't exist.

I think about this every time someone shows me a marketing dashboard.

The Dashboard Problem

I work with ecommerce agencies and SaaS companies. Every single one of them has a reporting dashboard. Most of them are beautifully designed. Clean graphs. Colour-coded KPIs. Weekly snapshots. Monthly trends. Impressions, clicks, open rates, CTR, bounce rate, sessions, conversions, MQLs, SQLs, and whatever acronym someone invented last quarter to make pipeline meetings feel more scientific.

And look, some of that is useful. I'm not about to argue that you shouldn't measure anything. That would be stupid. But here's what I notice, consistently, across almost every marketing function I get close to: the dashboard is full of things that are easy to count, and completely silent on the things that actually drive growth.

Nobody's measuring how many times your company got mentioned in a Slack channel you're not in. Nobody's tracking the conversation where an agency partner recommended your platform to a merchant over dinner. Nobody's quantifying the moment a prospect read your CEO's LinkedIn post six months ago, forgot about it, then remembered your name when a need arose and Googled you directly.

Those things are driving your business. Your dashboard can't see them. And because your dashboard can't see them, your team is slowly being optimised into ignoring them.

What Gets Measured Gets Managed (Into the Ground)

There's a second-order effect of the McNamara Fallacy that's even more dangerous than the measurement problem itself: it changes behaviour.

When your team is judged on metrics that are easy to track, they optimise for metrics that are easy to track. Your content person writes for organic traffic because that's what the dashboard rewards, not for the piece that would get shared in fifty private group chats but might only get three hundred page views. Your social media manager posts what gets impressions rather than what builds an actual point of view. Your email marketer optimises subject lines for open rates rather than writing something that makes someone think differently about your company.

None of these people are doing anything wrong. They're being rational. They're responding to the incentive structure you've created. The dashboard told them what mattered, and they believed it.

I've seen this play out with agencies in particular. An agency will produce a brilliant, opinionated piece of content that positions them as a genuine authority in their space. It gets average traffic. The dashboard says it underperformed. So they go back to writing "10 Tips for Better Email Marketing" because that's what the numbers reward. Six months later they're wondering why their pipeline is full of low-value leads who found them through a generic search query and have no idea what makes them different.

The dashboard didn't lie. It just told a very incomplete story, and the team mistook it for the whole truth.

The Things That Actually Matter (and How to Think About Them)

Here's where I'm supposed to give you a neat alternative framework. A new set of metrics. A better dashboard. But that would be doing exactly what McNamara did, just replacing one set of easy measurements with another set and pretending the problem is solved.

The honest answer is that the most important things in marketing are resistant to measurement. That doesn't mean you throw your hands up and stop trying. It means you hold your measurable metrics with the appropriate level of scepticism and supplement them with something else: judgement.

Here's how I think about it with the agencies and SaaS companies I work with.

Track the leading indicators you can see, but don't worship them. Organic traffic, engagement rates, email performance: these are signals, not verdicts. They tell you something is happening. They don't tell you whether the right thing is happening. A blog post with modest traffic but a 12-minute average read time is probably doing more for your brand than one with ten times the sessions and a 40-second bounce.

Pay attention to the conversations you can't track. Ask your sales team what prospects are saying on calls. Are they mentioning content? Do they reference something specific? Are they arriving already warm, or do they arrive cold and need educating from scratch? This is qualitative data, and it's messy, and you can't put it in a pie chart, but it tells you more about whether your marketing is working than your click-through rate ever will.

Monitor what your ecosystem is doing with your content. Are agency partners sharing your stuff? Are people tagging you in posts? Are you getting inbound requests to collaborate, speak, or contribute? These are signals of brand gravity — evidence that you've created something people want to be associated with. No dashboard tracks this. You have to actually look.

Run the "would anyone miss it?" test. For every recurring piece of content you produce, ask: if we stopped doing this tomorrow, would anyone notice? Would anyone ask where it went? If the answer is no, you're producing content for your dashboard, not for your audience. Kill it and spend that time on something that passes the test.

Accept that some of the best marketing can't be attributed. The CEO who posts something sharp on LinkedIn three times a week and slowly becomes the person people associate with your space — you'll never attribute a closed deal to post number forty-seven. But the compound effect is real, and pretending it isn't because you can't drop it into a spreadsheet is the McNamara Fallacy in action.

The Uncomfortable Truth

The reason the McNamara Fallacy persists in marketing isn't because people are stupid. It's because measurement feels like control. When you can point at a number and say "this went up," it feels like you understand what's happening. It feels like you're managing the situation. It feels scientific.

The alternative — admitting that a significant chunk of what drives growth is unmeasurable, and that your job involves making judgement calls based on incomplete information — is genuinely uncomfortable. Especially when someone in a board meeting asks "what's the ROI on that?" and the honest answer is "I believe it's significant but I can't prove it with a number."

Nobody wants to say that. So we build dashboards instead. We fill them with things we can count. We make them look authoritative. And we slowly, incrementally, optimise our marketing toward the things that are easy to prove and away from the things that actually matter.

McNamara had body counts. You have MQLs. The chart's going up. The question is whether you're winning.

I'm Will, founder of Waye. We help ecommerce agencies and SaaS companies focus their marketing on the things that actually build a brand — even when the dashboard can't see them. If your reporting is great but your pipeline isn't, that might be the problem.
