
When Data Is Right but Decisions Go Wrong
Strategy fails not from lack of information, but from outdated assumptions about what it implies


TL;DR
Strategic failures often stem not from absent data but from outdated interpretations of its meaning. This article distinguishes between ontology (an organisation's stable data categories) and semantics (what those categories mean in practice and how they drive action). Examples ranging from historical epidemics to the 2008 financial crisis show how decisions falter when the practical significance of seemingly reliable data drifts away from underlying conditions. For business leaders, the implication is that established classifications, such as risk ratings or performance metrics, can outlive their true meaning, and that AI systems will amplify these semantic misreadings. Routinely challenging the assumptions and meaning behind your data is essential for sound decision-making and for avoiding costly strategic missteps.

In the summer of 1854, cholera tore through London. Entire neighbourhoods were struck within weeks. Mortality rose sharply. Panic followed. The prevailing explanation was miasma—disease spreading through bad air—and the observable patterns appeared to support that belief.
Yet people kept dying.
John Snow, a London physician, noticed something the dominant theory could not explain. When he plotted the deaths on a simple map, they clustered around a single public water pump on Broad Street. When the pump handle was removed, infections in that area declined.
Snow did not discover new data. He changed how existing data was interpreted.
Strategy failures today often resemble that moment: the facts are visible, but the meaning attached to them is wrong.
Ontology and Semantics
Ontology concerns what exists in an organisational world—the categories, metrics, and classifications that structure reality. Think customer segments, risk ratings, performance bands, compliance status, strategic priorities. These categories determine what gets measured, reported, funded, and governed.
Semantics concerns what those categories mean in practice and how they translate into action.
Organisations function through an interplay of nouns and verbs. The nouns stabilise reality and enable coordination at scale. The verbs move the system: lend, approve, invest, expand, prescribe, incentivise, withdraw.
Semantics is the bridge between the noun and the verb. It determines what we do because something carries a particular label. When that bridge shifts, behaviour shifts—even if the underlying data remains correct.
When Categories Outlive Their Conditions
The financial crisis of 2008 offers a stark example. Complex securities carried AAA ratings derived from sophisticated models and historical data. Within those assumptions, the ratings were internally coherent. Over time, however, AAA came to imply safety across contexts: regulatory safety, capital efficiency, systemic resilience. That interpretation depended on assumptions about correlation.
The ontology did not change overnight. What changed was the meaning attached to the rating. When correlations broke down during the crisis, the rating categories persisted but their behavioural implications no longer matched reality. Capital continued to flow under an outdated interpretation of risk.
Healthcare provides a parallel. In the 1990s, pain began to be treated as a vital sign. Hospitals routinely recorded patient-reported pain scores, and satisfaction surveys increasingly included questions about pain control. These measures were tracked closely and, in some systems, linked to incentives. The category of pain management remained stable; what shifted was how success was defined. Lower reported pain and higher satisfaction became visible markers of quality. Prescribing behaviour adapted accordingly, and the consequences unfolded over time.
A more contemporary example lies in performance management. Many organisations adopted employee engagement scores as indicators of cultural health and productivity. The category was measurable and consistent. Over time, improvement in survey scores began to stand in for genuine cultural progress, even when underlying incentives and behaviours remained unchanged. Visible movement in metrics became a proxy for deeper organisational health.
In each case, information was not absent. The meaning attached to it had drifted.
AI and the Acceleration of Ontology
Artificial intelligence intensifies this challenge.
AI systems excel at classification. They score, rank, cluster, and predict at scale. Risk probabilities, customer segments, fraud alerts, diagnostic suggestions, churn likelihood—these categories become sharper and more consistent across an organisation. In effect, AI stabilises ontology.
Consider a bank deploying AI-driven credit scoring. The model analyses thousands of variables and produces highly accurate default probabilities. The data is robust, and the model performs well under current conditions. Over time, however, market structures change, borrower behaviour shifts, and economic correlations evolve. The score may remain statistically valid within its trained boundaries, but the action thresholds attached to that score—lending limits, pricing bands, capital allocation—may no longer reflect current realities. If decision rules remain anchored to earlier assumptions, risk accumulates quietly.
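To make the pattern concrete, here is a minimal sketch. The numbers and the threshold are entirely hypothetical, invented for illustration rather than drawn from any real lender. The point it demonstrates is the essay's: the noun (a score of 700) stays fixed while the verb attached to it (approve) quietly goes stale.

```python
# Illustrative sketch with hypothetical numbers: a score model can remain
# "accurate" on its training distribution while the action rule attached
# to it no longer reflects current conditions.

# Action rule defined under earlier conditions: approve if score >= 700.
APPROVE_THRESHOLD = 700

# Observed default rates by score band when the threshold was set...
default_rate_at_calibration = {700: 0.02, 650: 0.05, 600: 0.10}

# ...and after borrower behaviour and correlations have shifted.
default_rate_today = {700: 0.06, 650: 0.09, 600: 0.15}

def decision(score: int) -> str:
    """The category ('score 700') is unchanged; so is the verb it triggers."""
    return "approve" if score >= APPROVE_THRESHOLD else "decline"

# The same label now carries roughly three times the loss it did when the
# rule was written, yet the decision it triggers is identical.
print(decision(700), "- default risk at boundary, then vs now:",
      default_rate_at_calibration[700], "->", default_rate_today[700])
```

The failure here is not statistical: the score still ranks borrowers correctly. It is semantic: nobody has re-asked what "700" should mean for action today.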
As AI becomes embedded in credit decisions, healthcare triage, compliance monitoring, supply chains, and strategic dashboards, organisations gain precision in classification. The vulnerability lies in assuming that better categories automatically produce better decisions. Strategy falters when yesterday’s meanings are applied to today’s conditions.
In an AI-shaped world, truth may reside in the numbers. Advantage will belong to organisations that can update meaning faster than they update data.
The Symptoms of Semantic Drift
Semantic drift rarely announces itself directly. It shows up as organisational friction. Definitions become contested. Exceptions multiply. Workarounds harden into informal policy. Escalations increase because interpretation feels unstable. Authority recentralises in response to ambiguity.
By the time these symptoms are visible at the executive level, adaptability has already weakened.
Snow’s intervention worked because he questioned the interpretive frame, not the data itself. The deaths were visible to everyone. The explanation was not.
What Leadership Looks Like
Leaders can begin by examining the verbs attached to their most consequential nouns. What actions follow when something is labelled low risk, high potential, strategic, or compliant? Under what conditions were those action rules defined? When might they need revision?
Boundary conditions should be explicit. Categories should be stress-tested against shifts in correlation, incentives, or behaviour. Strategy conversations should revisit foundational definitions, not only performance against them.
In practice, this may involve revisiting decision thresholds attached to AI outputs, rotating teams across functions to challenge assumptions, or conducting periodic reviews of whether critical labels still map to the right actions.
Early signs of strain deserve attention: rising definitional disputes, growing reliance on exceptions, increasing requests for escalation. These often signal that the link between signal and action is loosening.
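One way to make that loosening measurable rather than anecdotal is to monitor how far the population behind a critical label has drifted since its action rule was set. A standard tool for this in credit risk is the Population Stability Index (PSI); the sketch below uses hypothetical score-band proportions, and the 0.25 "significant shift" cut-off is a common rule of thumb rather than anything this essay prescribes.

```python
import math

def psi(expected: list[float], actual: list[float]) -> float:
    """Population Stability Index over matching bins.

    Both lists are proportions that each sum to 1. A common rule of
    thumb: PSI > 0.25 suggests the population behind a score has
    shifted materially since the baseline was taken.
    """
    return sum((a - e) * math.log(a / e) for e, a in zip(expected, actual))

# Hypothetical share of applicants in three score bands,
# at model build versus today.
at_build = [0.50, 0.30, 0.20]
today    = [0.25, 0.30, 0.45]

drift = psi(at_build, today)
print(f"PSI = {drift:.3f}")  # ~0.376, above the common 0.25 threshold
```

A rising PSI does not say the model is wrong; it says the conditions under which its action thresholds were defined may no longer hold, which is exactly the moment to revisit the meaning, not just the metric.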
Holding Meaning Loosely
John Snow’s map endures because it shows how transformation often begins—not with new information, but with reinterpretation. The facts were present. The meaning had to change.
In environments saturated with data and increasingly organised by AI, the strategic challenge is maintaining alignment between what exists and what it implies.
For leaders, the discipline is straightforward but demanding: revisit the meaning behind the metrics, re-examine the action rules attached to core categories, and treat semantic drift as a strategic risk, not an operational nuisance.
Strategy fails less from lack of information than from failure to renew meaning.

Founding Fuel is sustained by readers who value depth, context, and independent thinking.
If this essay helped you think more clearly, you may choose to support our work.


Debleena Majumdar
Entrepreneur & business leader | Author
Debleena Majumdar is an entrepreneur, business leader and author who works at the intersection of narrative, numbers, and AI. She believes that in a world where AI can generate infinite content, the differentiator is not volume but meaning: the ability to connect strategy to a coherent story people can trust, follow, and act on.
She is the co-founder of stotio, an AI-powered Narrative OS built to help businesses distil strategy into clear, connected growth narratives across the moments that shape outcomes, be it fundraising, sales, brand evolution, or leadership reviews. stotio blends structured storytelling frameworks with a context-driven intelligence layer, so organizations build narrative consistency across stakeholders and decisions.
Debleena’s foundation is deeply rooted in finance and investing. Over more than a decade, she worked across investment banking, investment management, and venture capital, with experience spanning firms such as GE, JP Morgan, Prudential, BRIDGEi2i Analytics Solutions, Fidelity, and Unitus Ventures. That grounding in capital and decision-making continues to shape her work today: she is drawn to the point where metrics end and decisions begin and where leaders must translate complexity into conviction.
Alongside business, Debleena is a published author of multiple fiction and non-fiction books. She has contributed data-driven business articles to publications including The Economic Times over several years. She loves singing and often creates her own lyrics when she forgets the real ones. Humour is her forever panacea.
Across roles and mediums, her learning has been to use narrative with numbers, as a clear strategic tool that makes decisions clearer, communication sharper, and growth more aligned.
Arjo Basu
Systems thinker & technologist | Entrepreneur
Arjo Basu is a systems thinker, technologist, and entrepreneur working at the intersection of narrative, data, and AI. He believes the future of work, and leadership, depends on how well we humanize technology while building structures that can scale trust, clarity, and opportunity.
With over 25 years of experience across data strategy, enterprise architecture, and AI-led product innovation, Arjo has spent his career designing systems that bridge people, platforms, and purpose. His work is guided by a simple belief: systems thinking, when paired with the right technology and a clear narrative, leads to sustained impact.
He founded Moksho, an AI-powered interview intelligence platform reimagining how we hire and how we prepare to be hired through simulated scenarios, sharp feedback, and credibility-building certifications.
He is the co-founder and CTO of stotio, an AI-powered Narrative OS built to help businesses distil strategy into clear, connected growth narratives across the moments that shape outcomes, be it fundraising, sales, brand evolution, or leadership reviews. stotio blends structured storytelling frameworks with a context-driven intelligence layer, so organizations build narrative consistency across stakeholders and decisions.
Previously, Arjo served as a Principal Data Architect and Strategist for global financial services firms in the United States, where he led high-performance teams across geographies, built enterprise-grade data platforms on Snowflake and Databricks, and created the Data Maturity Framework, now used by multiple organizations to guide scalable, insight-led transformation.
Alongside his technology work, Arjo writes fiction, poetry, and essays that explore identity, memory, and belonging, often mirroring the same questions he engages with in systems and strategy: how structure shapes behaviour, how silence carries meaning, and how humans navigate complexity.
Across technology, narrative, and design, his work reflects a commitment to building systems with structure, clarity and momentum.
