Don’t blame it on the moonlight, blame it on the AI
Why AI is forcing a new standard of accountability and performance.

Dave Garcia
Founder and Co-CEO
Apr 27, 2026
Over the past year, a new narrative has taken hold across headlines, LinkedIn posts, and boardroom conversations: companies are firing people and replacing them with AI. The stories are compelling because they are simple, and they travel fast because they tap into a very real anxiety. But like most simplified narratives, they collapse under closer inspection. What we are witnessing is not a wave of companies cleanly substituting humans with machines, but a much more structural shift in how work is produced, measured, and justified.
The headlines vs. the underlying reality
A number of high-profile cases have fueled this perception. Companies across tech, customer support, and content production have announced layoffs while simultaneously highlighting their investments in AI.
In some cases, executives have explicitly stated that automation has allowed them to operate with leaner teams. In others, the connection is more implicit, but quickly amplified by media and social channels into a direct cause-and-effect story. The result is a distorted but powerful idea: AI comes in, people go out. It is a neat explanation, but it misses what is actually changing inside these organizations.
What AI is replacing today is not people in a holistic sense, but specific categories of work that were previously invisible, tolerated, or simply too expensive to optimize. Routine execution, boilerplate creation, coordination overhead, and parts of technical implementation are increasingly handled by systems that can operate faster and at lower marginal cost. This does not eliminate roles overnight, but it does change the economic logic of how those roles are structured. When the same output can be produced with fewer resources, the question is no longer whether a team can deliver, but whether it is delivering efficiently relative to what is now possible.
The misinterpretation of “AI layoffs”
This is where many of the so-called “AI layoffs” need to be reframed. Most organizations making these decisions are not reacting purely to technological capability, but to accumulated inefficiencies that have become harder to justify. Over-hiring during growth cycles, fragmented workflows, unclear ownership, and weak alignment between product and engineering have existed for years. AI does not create these issues, but it removes the buffer that allowed them to persist. Once a portion of the work becomes faster and cheaper, the rest of the system is exposed. The layoff is not the direct output of AI, but the consequence of a system that no longer holds under new constraints.
That’s where most of the attention goes. It’s easier to write headlines about layoffs than to notice the far more interesting shift happening underneath: AI isn’t just changing who gets hired, it’s fundamentally changing what scale looks like.
From outsourcing scale to elite teams with AI
One of the most under-discussed shifts is happening in the economics of outsourcing. For years, large organizations relied on scale as the primary lever: more people, distributed across geographies, handling well-defined pieces of work. Outsourcing worked because coordination costs were lower than the cost of building highly efficient internal systems, and because labor arbitrage created clear financial incentives.
AI is breaking that equation. When a small, highly capable team augmented with AI can produce the same, or better, output than a large distributed team, scale stops being the advantage. Coordination overhead, handoffs, communication gaps, and especially knowledge sharing, which were once accepted as the price of scale, become liabilities. The result is a gradual but meaningful shift from large outsourcing structures to smaller, elite teams that operate with significantly higher leverage.
This does not mean outsourcing disappears, but it does mean its value proposition is being redefined. The winning model is no longer about providing more hands at a lower cost, but about delivering higher-quality outcomes with tighter feedback loops and stronger ownership. In that context, AI is not just a productivity tool; it is a force that compresses organizational complexity.
The accountability shift
AI changes the baseline: if output can increase materially with the same or fewer people, then variation in performance across teams becomes impossible to ignore. The conversation moves from activity to outcomes, from effort to impact, and from narrative to evidence.
This is why the most relevant question is not whether AI can replace a given role, but why that role required a certain structure to begin with. In many cases, the answer has little to do with human capability and much to do with how work was organized. Poorly defined requirements, late collaboration between teams, excessive rework, and lack of clear ownership create artificial demand for headcount. When AI compresses parts of the execution layer, these inefficiencies become disproportionately visible. What used to look like necessary capacity starts to look like structural waste.
What will actually differentiate companies
The organizations that will come out ahead are those that treat AI as a forcing function to improve clarity. Clarity on what is being built, clarity on how teams collaborate, and clarity on what outcomes are expected for every unit of investment. In that context, AI is not a substitute for people, but a catalyst that raises the standard for how people operate. It rewards well-structured systems and exposes weak ones. It makes good teams better and average teams harder to justify.
So while it is tempting to attribute every layoff announcement to AI, the reality is more demanding. AI is not simply taking jobs; it is removing the margin for inefficiency that many organizations relied on. It is also reshaping where scale lives: not in headcount, but in leverage.
The question is no longer whether work can be done, but whether it is being done in a way that makes sense under a new economic model. And that is a much harder question to answer, but also a much more important one to get right.