Essay 7 of 7

The New Steady State

The disruption doesn't end. Hypercompetition isn't a phase to survive — it's the permanent condition of an intelligence-abundant economy.

16 min read

Every major technology wave in the past fifty years has followed the same competitive arc: disruption, turbulence, consolidation, equilibrium.

The internet restructured media. Newspaper classified revenue peaked near $49 billion in 2000 and fell to $4.6 billion by 2012. Thousands of digital properties launched and competed. Then the dust settled. By the late 2010s, Google and Meta had consolidated US digital ad spending, their combined share peaking north of 54% of the market. The chaos of the early web resolved into a recognizable oligopoly. The disruption was real, but it was finite.

Mobile followed the same pattern. Nokia collapsed from roughly $150 billion in market capitalization at its peak to a $7.2 billion fire sale of its handset business. BlackBerry and Palm disappeared. Hundreds of smartphone competitors entered the market. Then: Apple and Android. By 2020, the two platforms controlled more than 99% of global smartphone operating system market share. The UK’s Competition and Markets Authority described it directly — an “effective duopoly.” Intense competition at the start. A clear equilibrium at the end.

Cloud computing took longer but arrived at the same destination. AWS launched in 2006. For a decade, dozens of cloud providers competed. By 2025, three providers — AWS, Azure, and Google Cloud — held roughly 63% of global cloud infrastructure spending. The market consolidated. Incumbents found positions. The disruption produced a new equilibrium, and the equilibrium held.

This is not a coincidence. It is a well-documented pattern in the academic literature. Michael Tushman and Philip Anderson described it in 1986 as punctuated equilibrium in technological change: long periods of incremental improvement interrupted by brief episodes of radical change, after which a new stable order forms. Competence-destroying discontinuities arrive, wreak havoc, and then, critically, stop. Survivors consolidate. New dominant designs emerge. The system stabilizes.

This pattern is deeply embedded in how the technology industry thinks about change. The consulting firms sell disruption indices on the assumption that disruption is a phase, not a condition. Executives watch other sectors get restructured and believe they have time to prepare. “AI disrupts X” became the default frame because every prior technology wave ended in a new steady state.

The pattern suggests AI should follow the same arc. It will not.

The data was moving before AI arrived

The evidence that competitive advantage was becoming harder to sustain predates AI entirely.

Innosight’s analysis of S&P 500 tenure tells the story cleanly: the average company’s lifespan on the index declined from 61 years in 1958 to 33 years by 1964, to 24 years by 2016. Their 2021 projection: 12 years by 2027. Of the original 1955 Fortune 500, roughly fifty companies remain. A 90% extinction rate over seven decades.

Robert Wiggins and Timothy Ruefli quantified the dynamic in their 2005 Strategic Management Journal study, “Schumpeter’s Ghost.” Analyzing thousands of firms across decades, they found that fewer than 5% sustained superior economic performance for ten years or longer. Competitive advantage had become, in their words, “significantly harder to sustain.” This held across a broad range of industries, not just technology. The best-of-times periods were getting shorter.

L. G. Thomas and Richard D’Aveni extended the analysis in a 2009 study published in Strategic Organization, examining US manufacturing from 1950 to 2002. They documented “sharply increased within-industry heterogeneity of returns,” meaning the spread between winners and losers within the same industry was widening, and identified “a broad, monotonic shift towards a new, more dynamic form of competition.” Not in one sector. Across manufacturing as a whole. Over half a century.

This is the pre-existing condition. Competitive advantage was already eroding before GPT-2 existed. The punctuated equilibrium model was already breaking down. Each technology wave since the 1950s has produced shorter periods of stability, faster competitive entry, and less durable market positions.

Why intelligence on tap breaks the cycle

But each prior wave still produced a new equilibrium — just a shorter one. The internet disrupted media, and Google consolidated. Mobile disrupted phones, and Apple stabilized. Cloud disrupted hosting, and AWS endured.

The equilibria were shorter, the turbulence more intense, but the fundamental pattern held. Disruption was followed by consolidation. Turbulence resolved into order.

Intelligence on tap breaks this cycle for a specific, structural reason: the thing becoming cheap is not a particular technology applied to a particular industry. It is the input used to find the next competitive advantage.

When containerized shipping made logistics cheap, manufacturers who relied on geographic protection were exposed. But the process of finding new advantages still required expensive human cognition. Analyzing markets, designing products, identifying strategic positioning: that work took time and talent. Firms could regroup and rebuild because strategy itself was a scarce resource.

When the internet made distribution free, media companies that relied on distribution monopolies collapsed. But figuring out what came next still required human intelligence operating at human speed. Google had time to consolidate because the next competitive response took years to organize.

AI commoditizes the thinking itself. The analysis, the strategy, the identification of friction, the design of competitive responses: this is what intelligence on tap provides at near-zero marginal cost. The interval between one competitive advantage eroding and the next one being attacked compresses toward zero, not because the technology moves faster (though it does), but because the cognitive work of finding the next move is no longer scarce.

This is the structural distinction that matters. The high-frequency trading (HFT) analogy holds not as metaphor but as mechanism. Traditional financial markets had disruptions followed by stabilization because the process of identifying and exploiting new inefficiencies required human analysis, human judgment, human time. HFT automated the search for the next spread, and the result was not a new equilibrium but continuous competition with no stable state.

AI does the same thing to the broader economy. It lowers barriers to entry in specific markets, yes. But more than that, it automates the process by which competitors identify, attack, and erode whatever advantages remain. The cognitive work that used to buy incumbents time is exactly the work that intelligence on tap makes cheap: analyzing threats, formulating responses, redesigning strategies.

The cycle of disruption followed by equilibrium depended on a bottleneck: human cognition operating at human speed. Remove the bottleneck, and the cycle collapses into something new. Not disruption followed by stability. Not punctuated equilibrium with shorter intervals.

Continuous competition. No stable state. The dust does not settle.


If hypercompetition compresses margins wherever cognitive work meets digital output, an obvious question follows: where does economic value actually accumulate?

The structure has already formed. It looks familiar.

The infrastructure oligopoly

The PC revolution produced hypercompetition among software developers — thousands of companies building applications, most failing, margins under constant pressure. But the infrastructure layer consolidated into the Intel-Microsoft duopoly, which captured the majority of the industry’s profits for two decades. In 2000, Microsoft’s operating margin was 52%. The average application developer’s was in the single digits.

The internet produced hypercompetition among web services — millions of sites, brutal pricing pressure, entire categories commoditized. But the infrastructure consolidated into the cloud oligopoly. AWS, Azure, and Google Cloud now hold roughly 63% of a global cloud infrastructure market that generated $419 billion in revenue in 2025. The platforms everyone built on captured more value than the things built on them.

AI is producing the same structure, at larger scale.

The five largest AI infrastructure investors — Microsoft, Alphabet, Amazon, Meta, and Oracle — committed approximately $700 billion in capital expenditure for 2026, nearly doubling their 2025 spending. Roughly three-quarters of that is going directly to AI infrastructure: GPUs, data centers, cooling systems, power generation. Nvidia holds over 86% of the AI accelerator market and reported $51.2 billion in data center revenue in a single quarter in late 2025. Not all of this is infrastructure-as-a-service in the way AWS was — Meta’s 350,000-plus H100 GPUs serve its own products, not a platform for others — but the capital concentration follows the same gravity. The companies that can spend billions on physical infrastructure are pulling away from the companies that cannot.

At the application layer, the picture inverts. An estimated 15,000 to 25,000 AI wrapper applications exist globally, with roughly 375 new ones launching every month. Over 90% fail within their first year. Jasper AI reached a $1.5 billion valuation in October 2022, peaked at $120 million in annual revenue, then watched revenue fall by more than half after ChatGPT entered its market directly. The arc — from billion-dollar valuation to existential crisis — played out in about two years.

This is the same structural dynamic that played out in financial markets after HFT arrived. Competition compressed market-making margins to near zero while the infrastructure providers — exchanges, data centers, connectivity firms — captured the value. Citadel Securities alone now executes roughly a quarter of all US equity volume. The rents moved from the application of intelligence to the infrastructure that intelligence requires.

The Bruegel research group calls this a bifurcation in contestability. The infrastructure layer is becoming more concentrated — Meta’s AI buildout alone cost north of $10 billion — while the application layer is becoming radically more contestable. The barriers at the infrastructure level are the ones intelligence on tap does not erode: physical hardware, energy, land, cooling, the capital to operate at training-cost scale. The barriers at the application level are the ones it does erode: knowledge, complexity, speed of development.

One layer consolidates. The other fragments. Value migrates toward the layer where the barriers are made of atoms rather than bits.

The shrinking window of capture

The second dimension of value migration is temporal.

William Nordhaus, in a 2004 NBER study of US economic performance from 1948 to 2001, estimated that innovators capture approximately 2.2% of the total social surplus their innovations create. The remaining 97.8% flows to consumers as lower prices and better products.

The mechanism behind that number matters. Nordhaus estimated an initial appropriability rate of about 7% — the share of total social surplus the innovator can capture at the moment of introduction — which then depreciates at roughly 20% annually as competitors imitate, patents expire, and superior alternatives emerge. The cumulative present value of the innovator’s profit stream, relative to the total social value of the innovation, works out to that 2.2%.

In a hypercompetitive environment, the depreciation rate accelerates. Imitation happens faster. Competitive entry happens sooner. The window between “this is ours” and “this is available to everyone” compresses. If Nordhaus’s 20% annual depreciation was the rate when imitation took years and competitive entry required building organizations from scratch, what is the rate when Hugging Face can reproduce core functionality of OpenAI’s Deep Research in 24 hours? When a product category can be flooded with thousands of competitors building on the same foundation models within a single year?

The math is directional, not precise. But the direction is unambiguous: the innovator’s share of total surplus falls as the speed of competitive entry rises. Consumers capture more. Producers capture less. The window of profit extraction shrinks with each cycle.
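The directional claim can be made concrete with a stylized continuous-time version of the appropriability arithmetic. This is an illustration, not Nordhaus’s actual model: the exponential-decay form and the roughly 9% discount rate are assumptions chosen so the slow-imitation case lands near his published 2.2% figure.

```python
# Stylized appropriability arithmetic (illustrative, not Nordhaus's model).
# Assume an innovation yields a constant social surplus flow S, discounted
# at rate r. The innovator captures a share that starts at alpha0 and decays
# exponentially at rate delta as imitation erodes the position.
#
#   total PV of surplus      = S / r
#   PV captured by innovator = alpha0 * S / (r + delta)
#   innovator's share        = alpha0 * r / (r + delta)

def innovator_share(alpha0: float, delta: float, r: float) -> float:
    """Fraction of total social surplus the innovator ultimately captures."""
    return alpha0 * r / (r + delta)

# Nordhaus-era parameters: ~7% initial appropriability, ~20% annual decay.
# The 9% discount rate is an assumed value chosen for illustration.
slow_imitation = innovator_share(alpha0=0.07, delta=0.20, r=0.09)

# Hypercompetitive case: same initial capture, but the position erodes
# ten times faster.
fast_imitation = innovator_share(alpha0=0.07, delta=2.00, r=0.09)

print(f"slow imitation: {slow_imitation:.1%}")  # ~2.2%
print(f"fast imitation: {fast_imitation:.1%}")  # ~0.3%
```

The point of the sketch is the ratio, not the absolute numbers: holding everything else fixed, a tenfold faster imitation rate cuts the innovator’s share of total surplus by roughly a factor of seven.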

This creates an apparent paradox. Kenneth Arrow showed in 1962 that competitive firms have greater incentive to innovate than monopolists — what economists call the replacement effect. A monopolist’s gain from innovation is merely the incremental improvement over profits it already earns. A competitive firm goes from zero economic profit to the full value of the innovation, at least temporarily. Joseph Schumpeter argued the opposite: innovation requires the expectation of temporary monopoly profit as reward.

In a hypercompetitive world, both dynamics operate simultaneously and neither resolves. Arrow’s incentive intensifies — more firms entering, more firms innovating, each one hunting the next friction because they have nothing to protect. Schumpeter’s reward shrinks — the temporary monopoly gets shorter, the profit window narrows, the rent erodes before it consolidates. More innovation, faster, with less capture by any single innovator.

This is not a market failure. It may be what a functioning economy looks like when intelligence is cheap.

The consumer windfall

The destination is not abstract. It is lower prices.

LLM inference costs have fallen at a median rate of 50x per year, and over 200x per year since January 2024. Customer support that cost $5.60 per interaction with a human agent costs roughly $0.20 with an AI system — a 96% reduction. Legal document review that required an associate’s afternoon can be done in minutes. The premium that SaaS companies once charged for knowledge-intensive features erodes when a small team with API access can rebuild those features from scratch.
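The figures above imply a striking cadence. A minimal sketch of the arithmetic, taking the per-interaction costs and the 50x annual decline rate from the text as given:

```python
import math

# Per-interaction support cost: human agent vs. AI system (figures as cited).
human_cost, ai_cost = 5.60, 0.20
reduction = 1 - ai_cost / human_cost
print(f"cost reduction: {reduction:.1%}")  # ~96.4%

# If inference costs fall 50x per year, how often do they halve?
annual_factor = 50
halving_days = 365 * math.log(2) / math.log(annual_factor)
print(f"cost halves roughly every {halving_days:.0f} days")  # ~65 days
```

At the faster 200x-per-year rate, the halving interval shrinks to under seven weeks. Any pricing premium built on yesterday’s inference costs is repriced several times a year.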

The Nordhaus finding, from an era when competitive entry was slower and imitation was harder, already showed consumers capturing 97.8% of innovation’s surplus. In an era when competitive entry is measured in days rather than years, that share can only go up.

This is a genuine windfall for consumers and a genuine compression for producers. The total surplus grows — more innovation, more products, more capability, lower costs. But the distribution shifts further toward the consumer side. Producers compete the value away before they can capture it. The infrastructure layer takes its share. Everyone else operates in the space that remains.

The economy produces more value in this structure, not less. But it accumulates in two places: at the infrastructure layer, where physical barriers create durable positions, and at the consumer layer, where competition delivers the surplus as lower prices and better products. The application layer — where most companies operate, where most startups are built, where most economic activity happens — becomes the contested middle. Not valueless. Not unprofitable in every case. But structurally unable to sustain the margins or the market positions that the previous era made possible.


A few things this argument is not.

It is not claiming that all companies will fail, that profit is impossible, or that competition eliminates all advantage in all markets at all times. Plenty of firms will thrive. Some will thrive for a long time.

What it is claiming: the duration of advantage is compressing, the intensity of competition is increasing, and the stability of market positions is declining — not as a phase the economy is passing through, but as the way the economy now works. The evidence spans decades of data across industries well beyond technology. The mechanism is specific: when the cognitive work of finding, attacking, and eroding competitive positions becomes cheap, the interval between building an advantage and losing it collapses.

The Aghion question

The previous section described the paradox at the center of a hypercompetitive economy: Arrow’s incentive drives more firms to innovate, Schumpeter’s reward shrinks with each cycle, and the innovator’s window of profit capture compresses. That raises a question the economic literature has been wrestling with since at least 2005, when Philippe Aghion and his co-authors published an empirical finding that complicates both Arrow and Schumpeter.

The relationship between competition and innovation is not a straight line. It is an inverted U.

Innovation rises with competitive intensity up to a point, then falls as the prospect of reward becomes too small to justify the investment. Where an economy sits on that curve depends on who is doing the competing. Neck-and-neck rivals innovate aggressively to escape each other — each one trying to pull ahead, because parity means zero profit. Laggards, by contrast, reduce investment because the leaders are too far ahead for innovation to close the gap within the shrinking profit window.

The ratio of neck-and-neck competitors to laggards determines which side of the curve dominates. More neck-and-neck competition pushes toward the ascending side — more innovation. More lopsided industries push toward the descending side — less.
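The composition effect can be shown with a toy numerical sketch. This is not Aghion’s actual model; the linear functional forms and the assumption that the neck-and-neck share falls one-for-one with competition are invented solely to show how mixing the two effects produces an inverted U.

```python
def neck_and_neck_share(c: float) -> float:
    # Composition effect (assumed form): as competition intensity c rises,
    # leaders escape more often, so fewer industries stay neck-and-neck.
    return 1.0 - c

def escape_incentive(c: float) -> float:
    # Neck-and-neck firms: parity profits fall with competition, so the
    # gain from escaping rivals rises.
    return c

def laggard_incentive(c: float) -> float:
    # Laggards: post-innovation rents shrink with competition, so the
    # incentive to invest in catching up falls.
    return 1.0 - c

def aggregate_innovation(c: float) -> float:
    """Economy-wide innovation incentive at competition intensity c in [0, 1]."""
    s = neck_and_neck_share(c)
    return s * escape_incentive(c) + (1.0 - s) * laggard_incentive(c)

# The mix of the two effects peaks at intermediate competition:
for c in (0.1, 0.5, 0.9):
    print(f"c={c}: {aggregate_innovation(c):.2f}")  # 0.18, 0.50, 0.18
```

Neither effect alone bends: the escape incentive only rises with competition, the laggard incentive only falls. The inverted U emerges because rising competition also shifts the economy’s weight from the first group to the second.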

So where does a hypercompetitive, intelligence-on-tap economy land?

The funding data tells one story. AI startup investment reached $203 to $211 billion in 2025 — roughly half of all global venture capital. Seventeen US-based AI companies raised $100 million or more in the first seven weeks of 2026 alone. Innovation is not slowing.

But the capture data tells another. Jasper went from a $1.5 billion valuation to revenue collapse in under two years. Hugging Face reproduced core functionality of a frontier research product in 24 hours. Product categories flood with competitors building on identical foundation models before the first entrant has finished its second sales cycle.

Both stories are true simultaneously, and that is the point. More innovation, faster, with less capture by any single innovator. The ascending side of Aghion’s curve and the compressing profit window coexist — because intelligence on tap lowers the cost of attempting innovation so far that even a shrinking expected reward justifies the bet. When it costs $500 and a weekend to build what used to require a team of twelve and six months of runway, the threshold for “worth trying” drops through the floor.

The Nordhaus finding from the previous section — innovators capturing roughly 2.2% of the total surplus their innovations create — may turn out to be less a measurement of a specific era and more a description of what competitive markets converge toward. As imitation accelerates from years to days, that share falls further. The total surplus grows. Consumers capture more. Producers capture less per attempt, but there are many more attempts.

What survives

If sustainable competitive advantage is the wrong frame, what replaces it?

Speed. Not a moat — a muscle.

John Boyd, a US Air Force colonel, spent decades studying why some competitors survive continuous pressure and others do not. His framework came from an unlikely place: the Korean War. American pilots flying F-86 Sabres consistently outperformed opponents in MiG-15s that were, on paper, the better aircraft — faster in certain regimes, tighter turning radius at altitude. The F-86’s advantages were not about raw performance. A bubble canopy gave the pilot wider vision. Hydraulic controls enabled quicker transitions between maneuvers. The pilots who won were the ones who could see, reorient, decide, and act faster than their opponents could finish a single decision cycle.

Boyd formalized this as the OODA loop — observe, orient, decide, act — and the insight applies well beyond dogfighting. In any competitive environment under continuous pressure, the entity that cycles faster forces its opponents into perpetual reorientation. They are always responding to the last move while you are already executing the next one.
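The tempo argument reduces to a simple tradeoff, sketched below as a deliberately crude model. The cycle times and accuracy figures are invented numbers, not measurements; the point is only that decision volume can outweigh per-decision quality.

```python
# Toy OODA comparison: a competitor that completes more decision cycles can
# come out ahead even when each individual decision is worse.

def good_decisions(cycle_time: float, accuracy: float, horizon: float) -> float:
    """Expected good decisions over the horizon: cycles completed x hit rate."""
    cycles = horizon / cycle_time
    return cycles * accuracy

# Fast-but-rougher competitor vs. slow-but-careful one, same time period.
fast = good_decisions(cycle_time=1.0, accuracy=0.6, horizon=10.0)  # 6.0
slow = good_decisions(cycle_time=2.0, accuracy=0.9, horizon=10.0)  # 4.5

print(fast > slow)  # True: tempo beats per-move quality here
```

Halving the cycle time offsets a substantial accuracy deficit, and the real asymmetry Boyd described is worse than the arithmetic suggests: the slower party is also reacting to stale information, so its effective accuracy degrades as the gap widens.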

Richard D’Aveni argued that the only sustainable advantage is the ability to string together temporary advantages — exploiting one position while already building the next. Rita McGrath called it transient advantage. Wiggins and Ruefli found it empirically: the firms that sustained superior performance did so not by defending a single position but by concatenating a sequence of short-lived ones, each exploited and abandoned before competitors could fully respond.

Jay Barney and Martin Reeves put it directly in the Harvard Business Review in September 2024: “Gen AI will be more likely to remove a competitive advantage than to confer one.” When every competitor has access to the same cheap intelligence, cost advantages get matched simultaneously, differentiated features get cloned before the first sales cycle closes, and first-mover advantages do not last long enough to justify the cost of moving first.

The organizational capability that matters is not any particular advantage. It is the ability to find the next one, exploit it, and move on — faster than the field.

The closing loop

This series opened with a claim: “AI disrupts X” is the wrong question.

The consulting firms publish their disruption indices each January, ranking industries by vulnerability, offering a sequence. Travel first, then retail, then legal, then whatever comes next. The framing assumes disruption is a phase. That it happens to specific industries in a specific order. That after the restructuring, a new equilibrium forms and the system stabilizes. Every prior technology wave supported that assumption. The internet restructured media, and then Google consolidated. Mobile restructured phones, and then Apple stabilized. Cloud restructured hosting, and then AWS endured.

Six essays of evidence suggest otherwise.

Intelligence on tap does not produce a new equilibrium. It produces an economy in which the search for the next advantage, the erosion of the current one, and the entry of new competitors happen simultaneously and continuously. The infrastructure layer consolidates — it always does, when the barriers are made of atoms. But the application layer, where most companies operate and most economic activity happens, becomes contested ground where margins compress, positions shift, and the interval between building something and watching it replicated approaches the time it takes to build it.

Hypercompetition is not the transition. It is the steady state.

The question was never which industry AI restructures next. It was always: what happens when the input to all competitive activity becomes cheap?

Now the answer is visible. Not a disruption that resolves. Not an equilibrium that forms. Something the economy has not operated under before — and the beginning, not the end, of what intelligence on tap will do to competitive dynamics.

The dust does not settle.