Executive Summary
Most organizations operate on two parallel systems: the formal structure visible on the org chart and the informal one — built from daily decisions, AI-accelerated workflows, and cross-functional networks — where work actually happens. The two systems are increasingly out of alignment.
McKinsey's 2025 global survey of 2,000 executives found that roughly two-thirds have undergone an operating model redesign in the past two years, yet success still hinges on one underinvested factor: explicitly redesigning how decisions are made and who owns them — not just who reports to whom.
The average manager's span of control has grown nearly 50% since 2013, reaching 12.1 direct reports in 2025, while 97% of managers also carry individual contributor responsibilities. That is not a design choice. It is a system failure presenting as a job description.
AI has not simplified this. It has accelerated the divergence — enabling faster local decisions that, absent explicit decision architecture, accumulate into organizational incoherence. The California Management Review calls this "functional asynchrony": local hyper-efficiency that degrades collective coherence.
The path forward is not a reorg. It is treating the operating model as a living system with four addressable components: decision architecture, workflow reality, role load, and network health.
The Setup
A CFO approves a $2 million automation budget in January. By March, three separate teams have independently purchased different tools — all with legitimate local approvals — all solving the same problem in conflicting ways. The CFO never saw it coming. The org chart says she controls capital allocation. The reality is that by the time a formal request reaches her desk, the decision has already been made in Slack channels, project management epics, and informal conversations she doesn't touch.
No one violated policy. No one was insubordinate. The organization simply operates on a different model than the one documented.
This is not a compliance problem or a communication failure. It is a structural one. The operating model that executives believe they are managing — the one with clean reporting lines, approved budgets, and quarterly planning cycles — no longer describes how work flows, where decisions are made, or how ideas move. What actually governs execution is a parallel system: the informal networks where coordination happens, the workflows that AI and tools have rewired without authorization, and the daily judgment calls managers make when no one is watching.
The mismatch between the formal model and the real one is not new. But three forces have made it dangerously wider: the collapse of weak ties in hybrid work, the acceleration of local decision-making enabled by AI, and the quiet expansion of managerial load that has turned the people responsible for organizational coherence into overloaded intermediaries. Together, they have produced an operating model that looks credible on a board deck and fails in practice.
The question worth asking is not why execution is slow. It is why leadership keeps refining a map that no longer shows the terrain.
The Context
The operating model — how a company structures decisions, workflows, roles, and resources to execute its strategy — has always lagged behind the work it is supposed to govern. That lag is not new. What is new is the rate at which the divergence is widening, and the systemic forces driving it.
For most of the 20th century, formal structure was a reasonable proxy for how work actually happened. Hierarchy and physical co-location enforced alignment between the org chart and the operating reality. The informal network existed but was bounded by geography — the office floor, the department hallway, the executive suite.
That containment ended with distributed work. Hybrid and remote structures severed the physical proximity that once reinforced formal accountability. Microsoft's 2022 Work Trend Index found that 59% of hybrid employees and 56% of remote employees reported fewer work friendships — a proxy for the erosion of the weak ties that carry novel information across organizational boundaries.
The more rigorous evidence comes from MIT's Senseable City Lab, which analyzed the email network of 2,834 MIT researchers over 18 months. When the campus shifted to remote work, weak ties dropped by 38% within days — translating to more than 5,100 lost connections over the period. Strong within-team ties held or strengthened. The result was a network that became more insular and substantially less capable of generating or transmitting novel ideas. The org chart was describing an organization that no longer existed.
AI has compounded the problem in a direction few leaders anticipated. The intuitive expectation was that AI would reduce coordination overhead by automating decisions and streamlining workflows. The actual data suggests the opposite dynamic. Asana's 2025 AI Super Productivity Paradox report found that 90% of self-described high-productivity workers said AI creates more coordination work, not less, with 62% reporting quality issues requiring rework. Faros AI's telemetry analysis of over 10,000 developers — published alongside the 2025 DORA Report — found that while AI tools increased individual task completion by 21% and pull request volume by 98%, software delivery performance metrics remained flat. Code review time increased 91%. Bug rates climbed 9%. Individual productivity accelerated. Organizational throughput did not.
The California Management Review describes the mechanism precisely: when AI enables local tasks to be executed faster than downstream actors can absorb or process them, the result is saturation, broken workflows, and desynchronization. "A task can thus become counterproductive if its productivity exceeds what the rest of the system can absorb." This is not a technology failure. It is an operating model failure. AI amplifies what is already present — the strengths of well-designed organizations and the dysfunctions of poorly designed ones.
McKinsey's June 2025 survey of 2,000 executives across 16 sectors found that two-thirds of organizations have undergone an operating model redesign in the past two years — yet until recently, only 21% of redesigns led to improved performance. The persistent failure mode was consistent: leaders changed the reporting structure without rewiring the decision architecture and workflows underneath it. They moved the boxes and left the system intact.
The Analysis
The Real Operating Model Has Four Components — None of Them on the Org Chart
If the org chart does not describe how work actually flows, what does? Research across organizational behavior, network science, and management strategy consistently points to four components that together constitute the real operating model: decision architecture, workflow reality, role load, and network health. Most organizations actively manage none of them.
Decision architecture is the clearest case of deliberate neglect. Organizations typically know who owns P&L. They rarely know who owns the 15 to 20 recurring decisions — product prioritization calls, resource allocation tradeoffs, vendor selection thresholds, exception approvals — that actually determine execution speed and coherence. When those ownership rights are ambiguous, work does not stop. It continues through informal channels, with whoever has the confidence or authority to make the decision locally. McKinsey's updated operating model framework, the "Organize to Value" system, names decision rights explicitly as one of 12 elements that must be managed as a system. The firm's 2025 data show that following more than six of the refreshed redesign rules — now centered on alignment, decision rights, and workflow rewiring — yields a 95% success rate. Following the older, primarily structure-focused rules yields 55%.
The implication is not that structure doesn't matter. It is that structure without explicit decision ownership creates a power vacuum. And power vacuums do not stay empty. They fill with whoever moves fastest.
Workflow reality is the second gap. Every organization has a documented process and an actual one. The documented process reflects how the system was designed to work — typically years ago, by people who are no longer there, for a technology stack that has since been replaced. The actual workflow is what people do to get things done: the workarounds, the shortcut approvals, the informal checkpoints that exist because the official ones are too slow or too abstract. AI has accelerated the divergence between the two. Teams deploy AI tools to speed up their portion of a workflow without visibility into how that acceleration affects upstream inputs or downstream absorptive capacity. The result is what the CMR calls "functional asynchrony" — speed in one node that desynchronizes the system it is part of.
High-performing organizations do not layer AI onto existing processes. They redesign the process first, then deploy AI into the redesigned flow. BCG and McKinsey data, aggregated by Duperrin in 2025, show that high performers are three times more likely to fundamentally redesign workflows rather than accelerate existing ones.
Role load is the component most visibly broken. Gallup's January 2026 research on span of control documents a striking trend: the average number of direct reports per manager increased from 10.9 in 2024 to 12.1 in 2025 — a nearly 50% increase since Gallup first measured the metric in 2013. Simultaneously, 97% of managers carry individual contributor responsibilities alongside their management role, with the median manager spending 40% of their time on non-managerial work. Gallup's analysis is direct: managers who exceed the 40% individual contributor threshold show lower engagement, and that engagement declines further as team size increases.
The managerial layer is not bloated. It is overloaded and misassigned. Organizations have reduced management headcount and widened spans of control without removing the administrative and coordination tasks that filled the original, narrower spans. The manager is now expected to do more management with more people while making more individual contributions. That arithmetic does not work. What it produces is a population of people who are nominally responsible for coherence, development, and translation across organizational layers — and who do not have the capacity to do any of those things well.
Network health is the least managed and perhaps most consequential of the four. An organization's innovation and problem-solving capacity does not reside in its processes or org chart. It lives in the density and diversity of connections across the network — and specifically in the weak ties that carry novel information across functional boundaries. The MIT research in Nature Computational Science established the mechanism clearly: co-location produces weak ties through serendipitous proximity; remote work eliminates those encounters; and the loss is not random — it disproportionately erodes the cross-unit, cross-functional connections that drive innovation while preserving the within-team connections that reinforce existing thinking.
Deloitte's organizational network analysis confirms that informal influencers — people highly connected across the network, regardless of formal title — are often better predictors of organizational performance than those on the formal leadership team. Organizations that do not actively map and manage network health are, in effect, hoping the network maintains itself.
The AI Signal Is Diagnostic, Not the Problem
A common executive response to coordination breakdowns in AI-enabled environments is to treat them as a technology problem — imposing governance frameworks, standardizing tool stacks, or slowing AI adoption until the organization catches up. This misidentifies the cause.
AI does not create a coordination breakdown. It reveals it. When AI tools proliferate in an environment with a weak decision architecture, parallel conflicting decisions accelerate. When AI accelerates individual productivity in organizations with misaligned workflows, downstream bottlenecks sharpen. When AI enables autonomous local action in environments with limited network connectivity, innovation debt compounds more quickly.
The DORA Report 2025's summary is precise: "AI magnifies the strengths of high-performing organizations and the dysfunctions of struggling ones." The organizations responding to AI-enabled dysfunction by restricting AI are treating the amplifier as the source. The actual intervention is redesigning the underlying system that AI is amplifying.
Clarity Decay Is Not a Communication Problem
Gallup's 2024 data shows that only 47% of employees know what is expected of them at work — a number that has not meaningfully improved in a decade of investment in engagement programs, performance management platforms, and communication tools. The standard interpretation is that managers need to communicate more clearly.
The more precise interpretation is that clarity cannot be communicated into a system without a decision architecture. Employees do not lack information about their roles because managers are inarticulate. They lack it because the decision rights upstream of their roles are genuinely ambiguous, and no amount of better communication resolves structural ambiguity. Gallup's own 2025 meta-analysis quantifies what clarity actually delivers when it exists: improving clarity of expectations from current levels to best-practice levels drives a 9% increase in profitability and an 11% improvement in work quality. That is not a communication ROI. It is a structural redesign ROI.
Where This Argument Gets Complicated
The most serious objection is practical: operating model redesign is expensive, disruptive, and organizationally consuming. Even the improved 2025 success rate of 63% means more than one-third of redesigns fail to meet most objectives. Organizations in execution mode — managing a transformation, absorbing a recent reorg, facing market pressure — do not have the capacity for a comprehensive intervention. If formal redesign attempts fail at that rate, the prescription to "redesign the operating model" may create more turbulence than it resolves.
This is a real constraint. But McKinsey's data shows that failure rates correlate strongly with scope, not concept. Organizations that attempt to redesign all 12 operating model elements simultaneously fail far more often than those that target the two or three elements most misaligned with strategy. Decision architecture and workflow mapping can be addressed at the team or function level without enterprise-wide restructuring. This is not a program. It is a practice.
The cost of inaction is also real — it simply does not appear on a budget line. Coordination overhead, rework from conflicting AI-accelerated decisions, and innovation loss from eroded networks show up as slower cycle times, strategy drift, and a persistent sense that the organization is working harder than it should be for results that feel underwhelming. That is not a culture problem. It is an operating model tax. It compounds quietly.
Implications for Leaders
Audit your decision architecture before your next planning cycle. Identify the 15 to 20 recurring decisions that actually govern execution — not the big strategic calls, but the operational ones: who approves a vendor above $X, who owns the product roadmap exception, who adjudicates a resourcing conflict. Name a single owner for each, define the escalation threshold, and make both visible in the tools your teams actually use. This is a half-day exercise with a leadership team. Its absence costs weeks of coordination overhead per quarter.
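The output of that half-day exercise can be as simple as a shared registry: one row per recurring decision, one named owner, one escalation threshold. A minimal sketch of such a registry follows; every decision name, owner title, and threshold here is a hypothetical illustration, not a recommended taxonomy.

```python
# Minimal sketch of a decision-rights registry. All decision names,
# owners, and escalation thresholds below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class DecisionRight:
    decision: str         # the recurring operational decision being governed
    owner: str            # a single named owner (a role, never a committee)
    escalate_above: str   # the threshold at which the decision escalates

REGISTRY = [
    DecisionRight("Vendor selection", "VP Procurement", "$250k -> CFO"),
    DecisionRight("Product roadmap exception", "Head of Product", "cross-team impact -> CPO"),
    DecisionRight("Resourcing conflict", "Function lead", "multi-quarter -> COO"),
]

def owner_of(decision: str) -> str:
    """Return the single accountable owner, or flag the gap explicitly."""
    for right in REGISTRY:
        if right.decision == decision:
            return right.owner
    return "UNOWNED - this is the gap the audit exists to surface"
```

The point of the `UNOWNED` branch is the audit itself: any recurring decision that cannot be resolved to exactly one owner is, by definition, currently being made in the informal system.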
Map one critical workflow end-to-end — the actual one, not the documented one. Pick the workflow causing the most visible friction. Trace every step, handoff, approval, and workaround between intent and outcome. Note where AI tools have been inserted and where they have created downstream bottlenecks. The goal is not a process diagram. It is a diagnostic that surfaces where the formal model and the real one have diverged most dangerously.
Treat the 40% individual contributor threshold as a structural alarm, not a management style issue. Gallup's data are unambiguous: managers who spend more than 40% of their time on non-managerial work show declining engagement regardless of talent, and that decline worsens as the span of control increases. If your managers are routinely above that threshold, you have not reduced management headcount. You have transferred its costs onto the remaining managers and degraded the organizational functions — development, coherence, cross-functional translation — they were supposed to provide.
Make network health a visible metric for senior leaders. Collaboration metadata — from tools like Slack, Teams, and GitHub — provides enough signal to identify where cross-functional weak ties are thinning without requiring a formal network survey. Which functions are not communicating with each other? Where are the structural holes — the gaps in the network where novel ideas cannot flow? Senior leaders who are not actively cultivating cross-functional connections are not just missing a soft skill. They are allowing an innovation-and-coordination asset to depreciate.
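Two crude but useful signals can be computed directly from that metadata: the share of interactions that cross a team boundary, and the team pairs with no interactions at all (candidate structural holes). A minimal sketch, using a hypothetical team mapping and interaction log in place of a real Slack or GitHub export:

```python
# Sketch: two weak-tie health signals from collaboration metadata.
# The people, teams, and interaction log below are hypothetical; real
# input would come from Slack/Teams/GitHub exports.

from itertools import combinations

team_of = {
    "ana": "product", "ben": "product",
    "cam": "engineering", "dev": "engineering",
    "eve": "finance",
}

# (person_a, person_b) pairs that exchanged messages in the period
interactions = [
    ("ana", "ben"), ("ana", "ben"), ("cam", "dev"),
    ("ana", "cam"), ("eve", "ana"),
]

def cross_team_share(pairs):
    """Fraction of interaction events that cross a team boundary —
    a crude proxy for weak-tie health. Falling values mean insularity."""
    cross = sum(1 for a, b in pairs if team_of[a] != team_of[b])
    return cross / len(pairs)

def silent_pairs():
    """Team pairs with zero cross-boundary interactions —
    candidate structural holes where novel ideas cannot flow."""
    teams = sorted(set(team_of.values()))
    talking = {frozenset((team_of[a], team_of[b]))
               for a, b in interactions if team_of[a] != team_of[b]}
    return [(x, y) for x, y in combinations(teams, 2)
            if frozenset((x, y)) not in talking]

print(cross_team_share(interactions))  # 0.4
print(silent_pairs())                  # [('engineering', 'finance')]
```

These two numbers, tracked quarterly, give a leadership team a trend line for exactly the erosion the MIT research describes: a shrinking cross-team share and a growing list of silent pairs.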
Evaluate AI deployment against the organizational system, not the individual use case. Before scaling an AI tool or workflow, ask: what happens downstream when this node accelerates? Who absorbs the increased output? Does that absorptive capacity exist? The AI Super Productivity Paradox is not resolved by slowing AI adoption. It is resolved by designing the receiving system before accelerating the sending one. High performers redesign the workflow first; then they deploy the technology.
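The absorptive-capacity question has a simple arithmetic core: when a node's output rate exceeds the downstream node's processing rate, the backlog grows linearly and without bound. A toy sketch, using hypothetical rates loosely shaped like the Faros AI finding of a 98% jump in pull request volume against flat review capacity:

```python
# Sketch of the absorptive-capacity check. If an AI-accelerated node
# produces faster than the downstream node (e.g. code review) can absorb,
# the queue between them grows without bound. All rates are hypothetical.

def backlog_after(weeks, produced_per_week, absorbed_per_week):
    """Work items waiting downstream after the given number of weeks."""
    backlog = 0
    for _ in range(weeks):
        backlog += produced_per_week
        backlog -= min(backlog, absorbed_per_week)
    return backlog

# Before acceleration: 40 PRs/week against 45 reviewable — queue stays empty.
print(backlog_after(12, 40, 45))  # 0
# After a ~98% jump in PR volume with unchanged review capacity:
print(backlog_after(12, 79, 45))  # grows by 34/week -> 408
```

This is the calculation to run before scaling any AI deployment: not "how much faster does this node get?" but "does the receiving node's capacity exceed the new sending rate?" If it does not, the acceleration buys a queue, not throughput.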
The Bottom Line
The most disorienting thing about the gap between the formal operating model and the real one is that it produces no visible crisis. Decisions get made. Work gets done. Revenue comes in. The model looks fine from a sufficient distance.
What the distance obscures is the ongoing tax: the coordination overhead absorbed by every meeting that resolves what the org chart should have prevented, the innovation that does not happen because the network that would have produced it has gone quiet, the strategy that sits correct and unexecuted because the decision architecture underneath it belongs to a different organization than the one that wrote it.
Most executives believe they are managing their company. In practice, they are managing the friction produced by a system that no longer fits the work. The real job of leadership — the one that does not appear on any performance review — is redesigning the system so that better decisions become not just possible, but structurally inevitable.
Sources
McKinsey & Company. "The New Rules for Getting Your Operating Model Redesign Right." June 2025. https://www.mckinsey.com/capabilities/people-and-organizational-performance/our-insights/the-new-rules-for-getting-your-operating-model-redesign-right
McKinsey & Company. "A New Operating Model for a New World." June 2025. https://www.mckinsey.com/capabilities/people-and-organizational-performance/our-insights/a-new-operating-model-for-a-new-world
Gallup; Jim Harter. "Span of Control: What's the Optimal Team Size for Managers?" January 2026. https://www.gallup.com/workplace/700718/span-control-optimal-team-size-managers.aspx
Gallup. "The Great Detachment: Why Employees Feel Stuck." 2025. https://www.gallup.com/workplace/653711/great-detachment-why-employees-feel-stuck.aspx
Santi, P. et al. "The Effect of Co-location on Human Communication Networks." Nature Computational Science, August 2022. https://www.nature.com/articles/s43588-022-00296-z
MIT News. "Analysis of Email Traffic Suggests Remote Work May Stifle Innovation." September 2022. https://news.mit.edu/2022/remote-work-may-innovation-0901
Asana. "The AI Super Productivity Paradox." 2025. https://asana.com/resources/ai-super-productivity-paradox
Faros AI. "Key Takeaways from the DORA Report 2025: AI Impact on Dev Metrics." September 2025. https://www.faros.ai/blog/key-takeaways-from-the-dora-report-2025
California Management Review. "AI Productivity Blind Spot." January 2026. https://cmr.berkeley.edu/2026/01/ai-productivity-blind-spot/
Microsoft. "Great Expectations: Making Hybrid Work Work — Work Trend Index." 2022. https://www.microsoft.com/en-us/worklab/work-trend-index/great-expectations-making-hybrid-work-work
Deloitte. "Organization Network Analysis: Harnessing the Power of Networks." 2024. https://www.deloitte.com/us/en/services/consulting/blogs/human-capital/harnessing-organization-network-analysis.html
Duperrin, B. "The Impact of AI in Business: What the Reports Show." December 2025. https://www.duperrin.com/english/2025/12/08/impacy-ai-transformation-bcg-mckinsey/