The fall of a giant: How ChatGPT became so uncool no one wants to admit they use it.
Sophie Rose · 21 Mar 2026 · 6 min read
In 12 months, ChatGPT went from 87% market share to 68%. Its daily active user share on mobile dropped even harder, from 69% to 45%.
Over 1.5 million paid subscribers cancelled in March 2026 alone. The product got measurably worse - creative writing scores dropped from 97% to 36% - but the cultural rejection happened faster than the technical decline.
ChatGPT became something people confess, rather than celebrate. “I used ChatGPT” turned into an admission of inauthenticity. The tool that was supposed to democratise knowledge? It's become a marker of intellectual laziness.
It lost cultural capital, and went from revolutionary to embarrassing fast.
Idk about you, but I remember when ChatGPT launched and everyone wanted to show you all the cool sh*t it could do. The hype was real, as was everyone’s excitement about it. People were sharing screenshots of impressive conversations all over social. The tool felt democratising, maybe even empowering.
Compare that to now. Using ChatGPT has become something people disclaim: "I know I shouldn't have, but I used ChatGPT for this." Teachers can "tell" when students use it, even when they can't. Hiring managers judge resumes, and creative work gets dismissed if it "feels AI".
People get called out for ChatGPT's writing style even when they wrote it themselves. The guilt-by-association factor is massive. Admitting you use ChatGPT now signals incapability: that you're not smart, creative, or authentic enough to do it yourself.
It's become a confession of inadequacy rather than a productivity tool.
AI made polish cheap, so authenticity became premium
When anyone can generate perfect copy and flawless content at zero cost, the people who don't need AI assistance become more valuable. Having taste matters more than execution speed.
ChatGPT promised to democratise knowledge creation. Instead, it created a new class divide. People who can produce quality work without AI signal they have skills, taste, and discernment. Aaaand people who rely on ChatGPT signal they're faking competence they don't actually possess. All the gear, no idea.
In 2023, not using AI made you seem slow. In 2026, using ChatGPT makes you seem lazy.
The performative rejection
Saying "I don't use ChatGPT" has become a flex. Like how "I don't own a TV" used to signal intellectual superiority. Declaring your independence from AI assistance now signals you're smart and creative enough not to need it.
This creates fascinating dynamics. Everyone privately uses AI tools but publicly distances themselves from them. Writers add disclaimers that they wrote something themselves. Designers emphasise their work is human-created. Students sweeeear they didn't use ChatGPT (even when they likely did).
The stigma is real enough that people seek anonymous AI support to avoid judgment. Research shows people use ChatGPT for mental health conversations, mainly because they fear being judged by humans. But this reveals a paradox: ChatGPT reduces the stigma of seeking help while simultaneously creating stigma for using ChatGPT.
OpenAI lost control of the narrative
The company positioned ChatGPT as a "helpful assistant". Culture decided it was a cheating tool. Educational institutions treated ChatGPT use as academic dishonesty. Employers questioned whether candidates actually possessed the skills their AI-assisted work demonstrated.
The message was clear as day from every direction: if you're using ChatGPT, you're doing something wrong. OpenAI never effectively countered this framing. They focused on capability improvements while the cultural perception soured. CEO Sam Altman issued internal code red memos, but the public messaging stayed defensive.
Meanwhile, serious problems emerged. People experienced AI delusions after extended conversations. Multiple lawsuits allege ChatGPT contributed to mental health crises and suicides. Studies found ChatGPT routinely breaks core ethical standards of mental health care: therapists responded appropriately 93% of the time; AI therapy bots, less than 60%.
Each harmful incident reinforced the narrative that ChatGPT couldn't be trusted. Each lawsuit confirmed what critics had been saying: this technology isn't ready for how people are actually using it. And don’t even get me started on the environmental damage that everyone seems to be ignoring.
Everything happening here ties back to taste becoming the new core skill
ChatGPT democratised execution. But it couldn't democratise taste. The people winning now are the ones with strong editorial judgment about what AI generates. Not the ones blindly trusting its output (like, are you crazy?).
The Staples Baddie hits because she has genuine enthusiasm and expertise. She builds trust through human quirks and real knowledge. ChatGPT is the antithesis of all that: polish without substance, execution without judgment, output without expertise.
ChatGPT's fall from 87% to 68% market share in twelve months represents one of the fastest declines in tech history
But the numbers don't capture the full story. The real fall was cultural. From revolutionary tool to guilty confession.
In an era where authenticity and taste are premium, ChatGPT became associated with everything people are rejecting. Polish over substance, speed over craft, optimization over human connection. Nobody wants to admit they use it. And that stigma is harder to reverse than any technical problem OpenAI could likely solve. Yeowch.
-Sophie Randell, Writer