1. AI is creating cognitive overload
We usually think AI takes on the heavy mental lifting. But while AI is increasing output, it’s also overloading how we think and make decisions. People are doing more—but thinking less critically—and burning out trying to keep up.
- A Boston Consulting Group study found heavy AI users reported “brain fry” and reduced engagement in critical thinking, suggesting productivity gains may come with cognitive fatigue tradeoffs (AI Productivity).
- AI continues to grow and scale, but not necessarily in ways that align with how people need to use it. Adoption is rising, but trust is lagging: surveys show more Americans are using AI tools, while fewer say they trust the results (TechCrunch).
- A University of Pennsylvania study found users often follow incorrect AI advice, even when it contradicts their own expertise (AI Productivity, TechCrunch).
This creates a paradox: People are expected to use AI more, even as it increases cognitive load, and many don’t fully trust or question the outputs.
Whether it’s AI’s confidence or the speed at which we’re moving, this introduces real risk: misjudgment scales, and accuracy takes a back seat to speed.
2. The human role is strategy
In the short term, AI adoption may reduce some roles (we’re already seeing layoffs). But it’s also reshaping work—pushing humans toward higher-value functions like judgment, orchestration, and problem framing.
- End-to-end automation for software production is getting closer—a future sometimes called the "dark factory," in which no people are involved (Business Insider). Simon Willison reports that ~95% of his code is now AI-generated, illustrating how engineering roles are shifting from writing to reviewing and orchestrating (Business Insider).
- While some companies are reducing headcount tied to automation, others (including major AI labs) are simultaneously expanding technical hiring, especially in infrastructure and applied AI roles. NVIDIA's Jensen Huang challenges the need for layoffs: "For companies with imagination, you will do more with more. For companies where the leadership is just out of ideas, they have nothing else to do, they have no reason to imagine greater than they are, then when they have more capability, they don't do more." (Yahoo Finance, Fortune).
- About 15% of Americans say they would be willing to work for an AI boss, signaling early normalization of AI in management roles (TechCrunch). This shift could flatten traditional organizational structures, reducing layers of middle management (Fast Company).
- The number of AI agents is growing faster than humans can effectively manage them, creating a new bottleneck in orchestration and oversight (Forbes).
To keep up, how we work needs to change. Less doing, more directing. Less execution, more orchestration.
3. Investment capital is increasing for AI companies
Investment in AI is growing fast—but it’s growing faster than companies can turn it into real results.
- OpenAI has raised roughly $12 billion in new funding, as part of a round reportedly totaling ~$122 billion (New York Times, TechCrunch). The company’s valuation is widely reported in the $100B–$120B+ range, underscoring how capital-intensive the AI race has become.
- AI seed-stage startups, increasingly run by solo founders and supported by AI agents, are commanding high valuations earlier, with more capital flowing in at earlier stages (TechCrunch). One founder reportedly used just $20,000 in initial capital to build a company now valued at $1.8 billion, highlighting how AI is collapsing the cost of company creation (Forbes).
This looks familiar. Like the dot-com era, capital is chasing potential over proven outcomes. And like the housing bubble, complexity is making it harder to see where the real risk is.
The likely outcome isn’t the end of AI, but a correction. And when that happens, only companies with real value and differentiation will last.
4. AI's infrastructure demands are a major constraint
AI isn’t just a software problem anymore—it’s an infrastructure one.
- Oracle announced layoffs via email to thousands of employees as part of a broader shift in spending toward AI infrastructure and data center investment (CNBC).
- The infrastructure required to support AI—training models, running inference, and cooling data centers—is driving significant increases in energy demand. Greenpeace warns that AI could significantly increase global energy consumption, and researchers worldwide flag that the intensive computing and cooling requirements raise long-term questions about how to power it. Data centers already account for a growing share of electricity usage, with projections suggesting significant increases by 2030 as AI demand scales (ScienceDaily, Greenpeace).
Scaling AI means building more data centers, more energy capacity, and more physical infrastructure.
And that raises bigger questions around energy, cost, land use, and long-term impact.
AI is scaling faster than people can process, manage, and adapt to it—and it’s not slowing down.
So the real question is: How do we keep up? What’s the human impact? And can we scale with it?