
He said the better your leadership skills are, the better you'll be able to utilise AI.
That's when I realised a new kind of class divide is emerging, not between the people who use AI and those who don't, but between the people who know how to direct it and those who get directed by it. It's what I call The Prompt Gap.
We're entering an era where everyone technically has access to the same tools, but not everyone has the ability to get something worthwhile out of them. It's not about being "good at AI." It's about being good at thinking.
The clearer your thought process, the better your result. And just like with any human teammate, if your instructions are vague, you'll end up with a half-baked mess that only makes sense in your head. (And definitely can't be used on a slide deck.)
That's why the people who thrive alongside AI aren't the ones who offload everything to it and call it a day; they're the ones who know how to collaborate with it.
Most people think prompting is just knowing which words to use. But the real skill is knowing what you actually want before you start typing. That's a thinking discipline, not a tech one.
If you've ever managed a team, you'll recognise the pattern: people who give great briefs get great results; people who give vague ones spend the rest of the week cleaning up chaos. It's the oldest rule in the playbook.
If you're clear, contextual, and decisive, AI becomes a creative accelerator. If you're lazy or fuzzy, it becomes a mirror reflecting your confusion right back at you, only faster and more convoluted.
That's why I think of prompting as a leadership exercise. It's about learning to think in structure: here's what I need, why it matters, and what success looks like. The better you are at articulating those things, the better you'll perform in an AI-powered world, because you'll know how to steer the ship, not just sit in the passenger seat yelling "make it better."
There's a creeping kind of cognitive decay happening, what I lovingly call "AI brain rot." The more you let the model do your thinking for you, the less capable you become of doing it yourself.
Shortcuts are great until they start rewiring your instincts. If you always rely on AI to summarise, ideate, or structure your thoughts, you slowly lose the ability to hold complexity in your head. The muscles of original thought (pattern recognition, synthesis, creative leaps) start to weaken.
And when that goes, so does your competitive edge.
The goal isn't to eliminate cognitive effort; it's to reallocate it. You let AI handle the mechanical stuff so your brain can focus on strategy, context, and creative judgment.
If you skip the thinking altogether, you're not collaborating with AI. You're being replaced by it.
We spent decades worshipping technical skills like coding, analytics, and UX design. But as AI eats up more of the technical work, the premium shifts to something much older and rarer: communication.
Because AI doesn't think for you (at least, it shouldn't). It should scale your clarity. And clarity, in this case, comes from discipline, curiosity, and emotional intelligence. All things that can't be automated.
For the first time in history, we're watching the great flattening of skill hierarchies. Everyone can now produce, edit, analyse, and design at some baseline level. But only a few can orchestrate: take a messy idea and guide it into something coherent and valuable.
That's the new elite class: the AI-literate thinkers. The ones who use prompting as a cognitive tool, not a crutch.
The future doesn't belong to the people who know the fanciest AI tools. It belongs to the ones who know how to brief.
Learn to think clearly, ask precisely, and refine relentlessly. That's the work now. Because as it turns out, the great equaliser of technology has made one skill matter more than ever: the ability to ask better.
-Sophie Randell, Writer