Michael Levin, a developmental biologist, once trained planarian flatworms to navigate a simple maze. Then he cut off their heads. When the worms regrew their brains, they still remembered the route. 

It shouldn’t be possible. Memory, we thought, lives in the brain. Yet these worms stored it somewhere else entirely. Levin’s insight was unsettling: the categories we use to explain the world often misdescribe it – and we notice only when they break. 

Professional life isn’t so different. 

For decades, we organised work around tidy distinctions – the lawyer, the analyst, the consultant, the strategist – built on the idea that expertise sits in neat boxes: that specialisation creates value, and complexity ensures job security. These beliefs shaped careers, pay, and status. 

AI exposes their fragility. 

The professional hierarchy we imagined – routine tasks at the bottom, specialised expertise above, strategic judgment at the summit – was never a true map of human capability. It was a map of computational difficulty. Jobs were deemed skilled because machines couldn’t do them. 

That illusion is gone. 

Where humans see boundaries, AI sees a single substrate: symbol manipulation. Contracts, regulations, policies, models – all become the same kind of input. The silos that once defined professional identity were conveniences of the era now passing. 

The shift is visible everywhere. A financial analyst realises that decades of pattern-recognition advantage can now be matched instantly. An ESG specialist discovers that taxonomy fluency – once rare enough to command premium fees – can now be generated on demand. Even the strategist’s prized ‘sector intuition’ evaporates against systems trained on every annual report, earnings call, and analyst note ever released. 

Many senior professionals now grasp that the advantage they spent twenty years compounding can be reproduced in sixty seconds. The reflex is to burrow into narrower niches – more arcane rules, more specialised models. Like Levin’s worms, they remain organised around an old memory, a definition of expertise the world has already outgrown. 

Institutions have yet to catch up. Law firms still bill by seniority. Consultancies still promote for sector expertise. Universities still credential the stockpiling of technical knowledge. The result is a widening gap between what organisations reward and what actually creates value.

Expertise was never the accumulation of information. It was the capacity to change how someone else understands their own problem. 

Consider the consultant who sits with a CEO paralysed by a decision her team has debated for months. The breakthrough is not more data – it is a reframing of her problem: “You’re not choosing between plans. You’re choosing the company you want to be in five years.”

Or the board director who, by building trust in a fractured room, makes it possible to choose the option everyone privately knows is necessary. Not through superior slides – through the right words at the right moment. 

Boards do not shift direction because a model tells them to. Chief executives do not abandon strategy because a dashboard blinks red. Communities do not grant legitimacy because a spreadsheet is elegant. 

They move because a human they trust helped them make sense of the world.

Seen clearly, the old hierarchy falls away. Memorising knowledge, collecting rare facts, and being the expert in the room mattered only when information was scarce. In a world of instant synthesis, the scarce skill is sensemaking: the ability to frame problems, surface assumptions, and turn complexity into decisions people can act on.

What remains defensible is not a stack of technical competencies but a set of cognitive lenses: philosophical reasoning (spotting hidden premises), critical thinking (distinguishing signal from noise), and systems thinking (seeing incentives, interdependence, and second-order effects). 

AI does not have these lenses. It has outputs. The gap between the two is where human value now sits: in communicating insight in a way that changes another person’s mental model. We pretend major decisions are analytical. In truth, they’re narrative – shaped by the emotional intelligence to land the right words at the right moment. 

We treated communication as a soft skill; it is becoming the defining one. It is the form expertise takes when it matters: cognition made public. It is how humans align understanding and reduce uncertainty. When AI can generate any argument in seconds, the rare skill is the ability to change another mind. 

If the old categories are breaking, what replaces them? 

A new map of value, based not on hoarding technical knowledge but on interpretation, relationships, and synthesis – the capacities that turn information into understanding, and understanding into action. 

The scarcest professionals of the coming era won’t be those who know the most, but those who help others see most clearly. 

Expertise has always been relational. AI has simply removed the option to pretend otherwise.

We’d love to hear your thoughts – email luke@bwdstrategic.com or message him on LinkedIn if you’d like to continue the conversation.

About the Author

Luke Heilbuth is CEO of sustainability strategy consultancy BWD Strategic, and a former Australian diplomat.