In a viral essay posted to X, “Something big is happening,” Matt Shumer writes that the world is at a moment for artificial intelligence similar to the early days of Covid. The founder and CEO of OthersideAI argues that AI has gone from being a useful assistant to a general cognitive substitute. What’s more, AI is now helping to build better versions of itself. Systems that rival most human experts could soon arrive.
While insiders know that transformative change is coming fast, the general public is about to be blindsided. Continuing with the pandemic-era metaphor, Tom Hanks is about to get sick.
Between Shumer’s essay and the resignation of Mrinank Sharma, who led Anthropic’s safeguards team and published quite the farewell letter (it warns that “the world is in danger” due to “interconnected crises,” while hinting that the company “constantly face[s] pressures to let go of what matters most” even as it chases a $350 billion valuation)…well…some people are starting to lose it. Or, more precisely, people who were already super worried about AI are now even more worried.
Look, is it possible that AI models will soon, at the very least, unquestionably meet several so-called weak definitions of AGI? Many technologists, not to mention prediction markets, suggest yes. (As a reality check, however, I keep in mind Google DeepMind CEO Demis Hassabis’s statement that we still need one or two AlphaGo-level technological breakthroughs to reach AGI.)
But set the technological trajectory aside (I’m quite confident that generative AI is a powerful general-purpose technology) and consider some basic bottlenecks and limitations rooted in economics rather than computing.
The long road from demonstration to implementation. The jump from “AI models are impressive, even more so than you think” to “everything is changing imminently” requires ignoring how economies actually absorb new technologies. Electrification took decades to redesign factories. The Internet didn’t change retail overnight. AI adoption currently covers fewer than one in five U.S. commercial establishments, and deploying it inside large, regulated, risk-averse institutions requires heavy complementary investment in data infrastructure, process redesign, compliance frameworks, and worker training. (Economists call this the productivity J-curve.) In fact, early-stage spending can actually depress measured output before visible gains arrive.
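A toy back-of-the-envelope calculation (the numbers below are invented purely for illustration) captures the J-curve shape: early complementary spending shows up as cost years before AI-driven gains show up as output.

```python
# Stylized productivity J-curve (made-up numbers, not data): complementary
# investment registers as cost early on, while the payoff arrives with a lag.

baseline_output = 100.0

for t in range(1, 11):
    adjustment_cost = 8.0 if t <= 3 else 0.0    # assumed early intangible investment
    ai_gain = 0.0 if t <= 3 else 4.0 * (t - 3)  # assumed delayed payoff ramp
    measured = baseline_output - adjustment_cost + ai_gain
    print(f"year {t:2d}: measured output = {measured:.0f}")
```

Run it and measured output dips below the baseline for three years before climbing past it, which is exactly the pattern the J-curve literature describes.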
Richer doesn’t always mean busier. Let’s grant the optimists (and I certainly count myself as fairly optimistic) their assumption of rapid advances in AI capabilities. Output still wouldn’t necessarily explode. Historically, wealthier societies choose more leisure (earlier retirements, shorter workweeks), not more time in the office or factory. Economist Dietrich Vollrath has pointed out that higher productivity does not automatically translate into faster growth if households respond by supplying less labor. Welfare could increase substantially while overall GDP growth remains relatively modest.
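The arithmetic here is simple. A quick sketch with assumed figures: if productivity jumps 30 percent but richer households cut hours worked by 20 percent, measured GDP barely moves.

```python
# Back-of-envelope: GDP = productivity x hours worked. The gain and the
# labor-supply response below are assumptions chosen for illustration.

productivity_gain = 0.30   # assumed AI-driven productivity boost
hours_change = -0.20       # assumed shift toward leisure

gdp_change = (1 + productivity_gain) * (1 + hours_change) - 1
print(f"GDP change: {gdp_change:+.1%}")   # -> +4.0%
```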
The slowest sector sets the speed limit. Even if AI makes some services much cheaper, demand does not expand without limit. Spending shifts toward sectors that resist automation (healthcare, education, in-person experiences), where production is more closely tied to human time. (This is the famous “Baumol effect,” or “cost disease.”) As wages rise across the economy, labor-intensive sectors with weak productivity growth claim a larger share of income. The result: Even spectacular advances in AI can generate only moderate growth in overall productivity.
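Here is the Baumol arithmetic in miniature, with invented growth rates and spending shares: aggregate productivity growth is just a spending-weighted average of sector growth rates, and the weights drift toward the stagnant sectors.

```python
# Baumol effect sketch (assumed numbers): as spending shifts toward stagnant,
# labor-intensive sectors, aggregate productivity growth sags even if the
# AI-exposed sector booms.

def aggregate_growth(share_ai, g_ai=0.10, g_stagnant=0.005):
    """Spending-weighted average of sector productivity growth rates."""
    return share_ai * g_ai + (1 - share_ai) * g_stagnant

for share in (0.40, 0.25, 0.10):  # AI-sector spending share drifting downward
    print(f"AI-sector share {share:.0%}: aggregate growth {aggregate_growth(share):.1%}")
```

Even with the AI-exposed sector growing at a blistering 10 percent a year, the aggregate rate slides from 4.3 percent toward 1.5 percent as spending migrates to the slow sectors.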
The narrowest pipe in the economy. In a system built from many complementary parts, explains economist Charles Jones, the narrowest pipe determines the flow. AI can speed up coding, writing, and research as much as it wants. But if energy infrastructure, physical capital, regulatory approval, and human decision-making move at their usual pace, those become the binding constraints that limit how quickly the entire economy can grow.
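A minimal sketch of that logic, with made-up task speeds: when tasks are strong complements, output moves at the pace of the slowest one, so speeding up cognitive work alone barely registers.

```python
# Jones-style bottleneck (illustrative task speeds, perfect complements):
# with complementary inputs, the slowest task sets the overall pace.

task_speed = {
    "coding_and_research": 1.0,
    "energy_buildout": 0.03,
    "regulatory_approval": 0.02,
}

print("pace before AI:", min(task_speed.values()))

task_speed["coding_and_research"] *= 100   # AI makes cognitive work 100x faster
print("pace after AI: ", min(task_speed.values()))  # unchanged: the bottleneck binds
```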
Economies are adaptable, complex, and wondrous systems. They create physical objects that embody and accumulate complex information, what economist César Hidalgo elegantly calls “crystals of imagination.” And when they change, they adjust through gradual reorganization and reallocation, not through sudden collapse or instantaneous takeoff. I mean, that should be your base case.
Now, a certain degree of urgency may be justified. (Shumer’s advice to embrace the most powerful AI tools now and integrate them into your daily work seems prudent.) Panic-inducing analogies to early 2020 probably aren’t.
This article originally appeared in the Pethokoukis newsletter “Faster, Please!”

