Moon Foundry

The Middle Manager Was the Algorithm

The layer being displaced was already doing what the machine does.

Alex Albano | 7 min read

When Klarna’s CEO Sebastian Siemiatkowski told Bloomberg that the company had shrunk from roughly 5,000 employees to 3,500 through attrition and a hiring freeze, with AI absorbing the departed roles, the marketing department was one of the areas he named explicitly. Klarna’s in-house marketing team was halved. Agency spend dropped by 25%. The company reported saving $10 million annually on marketing operations, with fewer people producing more output through AI-assisted workflows. Siemiatkowski framed it as efficiency, and the numbers supported the framing. Then, months later, he publicly admitted the company had gone too far, that service and product quality had suffered, and that Klarna was hiring humans back into roles the AI had been expected to fill.

That arc stayed with me, because the pattern it contains is showing up everywhere I look, and the correction that followed is the part most AI restructuring narratives leave out. The coordination layer in marketing organisations (the brief approvals, the weekly syncs, the campaign routing, the resource allocation across content, demand gen, and product marketing) is being absorbed by agentic workflows and by restructured reporting lines in which individual contributors report directly to senior leadership, with AI-assisted project management handling the connective tissue. The tools replace the coordination function, which turns out to be a narrower thing than anyone had mapped, and the judgment that managers developed alongside that function is left without a structural home. Klarna discovered this. Most companies running the same playbook have not discovered it yet.

The routing layer

Middle management in marketing organisations has always performed a specific structural function that is distinct from the expertise nominally associated with the role. In a typical B2B or mid-market marketing team, this function is translation and coordination: receiving strategic direction from the VP or CMO, interpreting it into campaign briefs and sprint plans, distributing work across content, demand gen, product marketing, and design, resolving conflicts over resources and priorities, aggregating performance into reports, and feeding information back up the hierarchy in a format that leadership can act on.

This function is real, and organisations that lose it without replacement experience coordination failures that are visible almost immediately. Projects drift. Priorities conflict. Information that used to flow through a human intermediary stops flowing, and the resulting gaps show up as missed deadlines, duplicated work, and strategic misalignment between what leadership wants and what teams produce.

The uncomfortable observation, the one that the current wave of AI-driven restructuring is making visible, is that this function is largely procedural. It requires judgment, but the judgment is of a specific and bounded kind: deciding which task to prioritise based on known criteria, routing information to the right people based on understood organisational structure, flagging exceptions that fall outside established parameters. These are exactly the kinds of decisions that agentic AI systems perform well, because they are pattern-matching operations performed across structured information within known constraints.
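The procedural character of those decisions can be made concrete with a toy sketch. Everything here is invented for illustration (the team names, criteria, and thresholds are assumptions, not any real company's workflow); the point is only that each decision reduces to fixed rules applied to structured inputs.

```python
# Toy illustration: campaign-brief routing as pure pattern matching.
# Teams, criteria, and thresholds are invented for illustration only.

def route_brief(brief: dict) -> dict:
    """Assign a brief to a team and flag exceptions, using fixed rules."""
    routing = {
        "blog_post": "content",
        "paid_campaign": "demand_gen",
        "launch_messaging": "product_marketing",
    }
    team = routing.get(brief["type"])
    decision = {"team": team, "priority": "normal", "exception": False}

    # Prioritisation: known criteria, mechanically applied.
    if brief.get("revenue_impact", 0) > 100_000 or brief.get("exec_sponsor"):
        decision["priority"] = "high"

    # Exception flagging: anything outside established parameters
    # escalates to a human rather than being routed automatically.
    if team is None or brief.get("budget", 0) > 50_000:
        decision["exception"] = True

    return decision

print(route_brief({"type": "paid_campaign", "revenue_impact": 250_000}))
print(route_brief({"type": "rebrand", "budget": 80_000}))
```

Nothing in the function requires contextual judgment: given the same inputs, it always produces the same routing, which is exactly why an agentic system can absorb this layer while leaving the tacit knowledge described below unreplicated.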

The work middle managers performed was closer to algorithmic execution than the job title, the compensation level, or the organisational narrative suggested. The layer existed because humans were the only available technology for performing coordination at scale within complex organisations. Now they are not, and the adjustment is happening faster than anyone expected precisely because the gap between what the role required and what the role was perceived to require was wider than anyone had mapped.

Institutional memory as collateral damage

The coordination function is the visible part of what middle management does, and it is the part that AI can replicate. The less visible part, the part that organisations are only beginning to recognise they have lost, is the accumulation of institutional knowledge that happened as a side effect of performing the coordination function over time.

A marketing manager who has spent three years routing campaigns, resolving cross-functional conflicts, and translating strategy into execution has, in the process, built a detailed operational model of how the organisation actually works. They know which engineers respond to which kinds of requests, which sales teams need which kinds of support, which executives care about which metrics, and where the formal organisational structure diverges from the informal power dynamics that actually determine what gets done. This knowledge is tacit, accumulated, and almost never documented, because it was never anyone’s job to document it. It accumulated as a byproduct of doing the coordination work.

When the routing layer is replaced by an agentic system, the coordination continues. The institutional knowledge does not. The AI system routes tasks efficiently based on the parameters it has been given, but it does not accumulate the contextual understanding that a human coordinator develops through years of navigating the organisation’s actual, rather than stated, operating dynamics. The result is an organisation that is operationally efficient in the near term and structurally less intelligent over time, because the mechanism through which operational wisdom was being created has been removed along with the operational function it was attached to.

This is the trade-off that most AI-driven restructuring narratives are not examining with sufficient care. The efficiency gain is real and measurable. The knowledge loss is real and difficult to measure, which means it tends to be ignored in the decision-making process and discovered retroactively, usually when the organisation encounters a situation that requires exactly the kind of contextual judgment that its departed middle managers had accumulated and that its agentic systems have no mechanism to develop.

The competence question, again

The Ada Lovelace Institute’s framework applies here with uncomfortable precision. The routing layer performed execution that looked like judgment, and the distinction between the two was invisible for as long as humans were the only available technology for performing the function. The arrival of AI agents that can perform the same routing has made the distinction visible, and the result is a reclassification of work that many professionals experienced as expertise.

This reclassification is neither fair nor unfair. It is structural. A marketing manager who spent years developing the ability to coordinate complex cross-functional campaigns was doing skilled, hard-won work. The skill was real, the experience was hard-won, and the outcomes were often excellent. The problem is that the skill, when decomposed into its constituent operations, turns out to be largely replicable by systems that perform pattern-matching, information routing, and exception flagging across structured data. The expertise was in the orchestration, and orchestration is precisely the territory where AI agents operate most effectively.

The practitioners who navigated this layer successfully and who will navigate the next phase of their careers successfully are the ones who can distinguish between the coordination function they performed and the strategic judgment they developed in the process of performing it. The coordination function is automatable. The judgment, the part that involves reading organisational dynamics, anticipating political obstacles, understanding which stakeholders need what and why, is not automatable in the same way, because it depends on contextual understanding that AI systems do not currently accumulate.

The question for anyone leading a marketing team right now is whether the restructured organisation will create roles that preserve and develop that judgment, or whether the efficiency logic that drove the restructuring will continue to compress the entire middle layer until the strategic judgment is lost along with the coordination function it used to be embedded in. The marketing manager who knew that a particular product launch needed a different messaging approach for the European subsidiary, because she had been in the room for the last three product launches and remembered what worked and what did not, carried knowledge that no project management agent currently captures.

What happens to the organisational brain

The deeper structural question is about what kind of marketing organisation emerges when you remove the layer that was, without anyone designing it this way, functioning as the team’s distributed memory.

A VP of Marketing operates on abstracted information: pipeline dashboards, attribution reports, quarterly strategy decks. Individual contributors operate on specific, task-level information: this campaign, this landing page, this email sequence, this customer segment. The middle layer was the place where those two levels of abstraction met, where strategic intent was tested against operational reality and where operational detail was synthesised into strategic insight. It was, in a meaningful sense, the organisation’s connective intelligence.

Replacing that layer with AI systems that route information efficiently but do not synthesise it contextually produces an organisation with a specific kind of cognitive architecture: excellent at executing known patterns, efficient at distributing work, fast at aggregating data, and progressively less capable of the kind of lateral, contextual, experiential reasoning that allows institutions to navigate situations that do not match established patterns.

The organisations that manage this transition well will be the ones that recognise what they are actually restructuring. The routing function is replaceable. The institutional intelligence that lived inside the routing function is not, at least not by the same systems, and building new mechanisms to develop and preserve that intelligence requires understanding that it existed in the first place, which most organisations have not yet done.

What keeps coming back to me is that the speed of this restructuring is itself diagnostic. If replacing a layer of human workers with AI agents were structurally difficult, the transition would be slow, contested, and incremental. The fact that it is happening rapidly, across industries, with relatively little operational disruption in the short term, tells us something about the nature of the work that was being performed. The middle manager was closer to the algorithm than the org chart suggested. The consequences of that revelation are still arriving, and the organisations experiencing them are only beginning to understand what they have optimised away.

