Stop being fed. Start thinking.
Why I’m using AI to escape the algorithm, not feed it
I turned 50 this year and I’ve noticed something uncomfortable about my own head.
A lot of what I believe about the world wasn’t decided by me. It was decided for me. Drip-fed through a timeline, a news app, a YouTube sidebar, a group chat. By the time an opinion arrived in my brain it felt like mine. It wasn’t. It was the residue of a thousand small nudges from systems designed to keep me scrolling, not thinking.
That bothers me. It should bother you too.
The trap is not the content. It’s the delivery.
The problem isn’t that social media shows you rubbish. The problem is that it shows you your rubbish. Endlessly. In flattering light.
Every platform is optimised for one thing: engagement. Not truth. Not nuance. Not your long-term wellbeing. Engagement. The algorithm learns what makes you react, and then it feeds you more of that. Outrage works. Confirmation works. Certainty works. Doubt and nuance do not work, so you see less of them.
Your brain does the rest. Repetition feels like truth. Familiar ideas feel correct. Opposing views feel like attacks. Psychologists have names for all of this — confirmation bias, the illusory truth effect, the availability heuristic — but you don’t need the jargon. You only need to notice the pattern. You are being trained, every day, by a system that does not care whether you are right.
Here’s a small test. Pick a country you have strong feelings about. China, say, or Russia, or America. Now ask yourself honestly: where did those feelings come from? Did you visit? Did you read the primary sources? Did you talk to people who live there? Or did you absorb a mood, over years, from headlines and clips and posts written to make you feel a particular way?
If it’s the second one, you don’t have a view. You have a reflex.
The honest bit
I’ve been guilty of all of it. I spent years consuming news like it was a vitamin. I had strong opinions about places I’d never been and people I’d never met. I mistook being well-fed for being well-informed. They are not the same thing.
The moment I realised I’d been outsourcing my thinking to strangers with business models, something shifted. Not to the other extreme. I’m not about to start shouting about mainstream media lies or fall down a conspiracy hole. That’s just the same trap with different wallpaper. Both tribes are fed. They just eat different meals.
What I want is something older and quieter. I want to think for myself again.
AI as the way out, not the way further in
This is where most people get the story backwards. They assume AI is part of the problem — another algorithm, another feed, another system making decisions for us. And it can be, if you use it passively.
Used well, AI is the first genuinely useful tool we’ve had for pushing back against all of this.
Here’s the difference. A social feed decides what you see. A search engine ranks what you asked for. Both are doing something to you. A good AI model, prompted properly, does something with you. It will argue with you. It will give you the strongest case against your own position. It will surface evidence you didn’t know to look for. It will admit when it doesn’t know. It has no ad inventory to protect and no tribe to please.
That last bit is the key. When I ask an AI to steelman a view I disagree with, it doesn’t get offended. It doesn’t try to win. It just does the work. That is a strange and valuable thing in 2026.
The catch is that AI will happily agree with you if you let it. Ask a leading question, get a leading answer. Most people use AI as a mirror and then complain it only shows them themselves. The fault is not in the tool.
How I actually use it
Five things. Nothing clever. All of them work.
One. I ask for the opposite. Before I accept any view I hold, I ask AI to give me the strongest possible case against it. Not a straw man. The real thing, argued well, by someone who believes it. If my position survives, it’s probably mine. If it doesn’t, I needed to know.
Two. I ask for the sources, not the summary. Summaries are where bias hides. I want to know who said what, when, and with what evidence. “Give me the five most cited studies on this, including the ones that disagree, and tell me who funded them.” That prompt alone has changed my mind about more things than a decade of news consumption.
Three. I ask it to be a sceptic. “Act as a hostile reviewer. Find every weak point in this argument.” You cannot do this well for your own ideas. Your brain is on your side. AI isn’t on anyone’s side, which is exactly what you need.
Four. I check the AI. Models get things wrong. They hallucinate. They have training biases. So I treat every answer as a starting point, not a verdict. I ask “what assumptions did you make?” and “where might this be wrong?” Then I go and look at the primary source myself when it matters.
Five. I decide. This is the one people forget. AI is not there to give me my opinions. It’s there to make sure the opinions I end up with are actually mine — tested, evidenced, and held deliberately. The final call is always human. That’s the whole point.
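If you work with AI programmatically rather than through a chat window, the first four habits above can be kept as reusable prompt templates. Here is a minimal sketch in Python; the function names and exact wording are mine, illustrative only, not any library's API. Step five has no template, because the decision is yours.

```python
# Reusable prompt templates for the habits described above.
# Wording is illustrative; adapt it to your own voice and model.

def steelman(position: str) -> str:
    """Habit one: ask for the strongest case against your own view."""
    return (
        "Give me the strongest possible case against this position, "
        "argued as someone who genuinely believes it would argue it. "
        f"No straw men.\n\nPosition: {position}"
    )

def sourced(topic: str) -> str:
    """Habit two: sources, not summaries."""
    return (
        f"Give me the five most cited studies on {topic}, including "
        "the ones that disagree with each other, and tell me who "
        "funded them."
    )

def hostile_review(argument: str) -> str:
    """Habit three: ask the model to be a sceptic."""
    return (
        "Act as a hostile reviewer. Find every weak point in this "
        f"argument:\n\n{argument}"
    )

def audit() -> str:
    """Habit four: check the AI's own answer before trusting it."""
    return "What assumptions did you make, and where might this be wrong?"
```

Send the returned string to whichever model you use, then follow up with `audit()` on its answer. None of this is clever engineering; the value is in asking the same four questions every time instead of only when you remember to.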
The quiet payoff
None of this makes me clever. It makes me slower, which turns out to be the same thing in a world optimised for speed.
I notice I post less. I argue less online. I’m less certain about things I used to be certain about, and more certain about a smaller number of things I’ve actually thought through. My feed is quieter because I’m not feeding it. I read primary sources I’d never have found. I change my mind in public sometimes, which used to feel embarrassing and now feels like the whole job.
The strange side effect is that I trust my own views more, not less. Because I know how they got there.
This is the position I want to stand on
I run an AI consultancy. I could tell you AI is going to transform your business, ten-x your output, free up your weekends. Plenty of people are already saying that, loudly, and some of it is even true.
But the thing I actually care about, the thing I’d want on my gravestone before any of the business stuff, is this: in an age where almost everything you see has been chosen for you by a machine that doesn’t have your interests at heart, the most radical thing you can do is think for yourself. And the best tool we have ever had for doing that is now sitting on your phone.
Use it properly and you get your mind back.
Use it lazily and you just have a more articulate feed.
The choice, for once, is actually yours.