From Survival to Stagnation: Why Your Brain Loves AI – and Why That’s a Problem


Every time I fire up LinkedIn lately, I’m met with another post about AI.

Infographics, carousels, reels, webinars — all promising to help you “stay ahead of the curve.” There’s no shortage of breathless headlines like:

  • “10 ChatGPT Prompts That Will Replace Your Marketing Team”
  • “How I Used AI to 10x My Productivity and Still Made My Pilates Class”
  • “This AI Stack Will Save You 17 Hours a Week (No, Really!)”
  • “If You’re Not Using These 5 AI Tools, You’re Already Behind”

It’s a bit of a gold rush. And look, I get it — AI is exciting, powerful, transformative.

But it’s also emerging in a world that’s already obsessed with speed.

We’re under constant pressure to do more, deliver faster, and stay “always on” — often with fewer resources and tighter budgets. In that environment, tools that promise instant answers, effortless creativity, and faster output feel less like a luxury and more like a necessity.

And yet… something feels off.

In the middle of all this noise, there’s something we’re not talking about enough:

What if our growing reliance on AI — for answers, ideas, even decisions — is slowly dulling the very cognitive skills we once relied on to thrive?

That’s the thought that led to this piece.

Because if we’re honest, we’re not just using AI to boost productivity — we’re using it to avoid thinking. And that should give us pause.


The Lazy Brain Is a Smart One (Sort of)

Let’s rewind 200,000 years.

Back then, your brain’s job wasn’t to be innovative or curious — it was to keep you alive. And thinking burns calories. A lot of them. In fact, your brain uses around 20% of your body’s energy, even at rest.

So evolution gave us a survival trick: mental shortcuts. If we could conserve energy by avoiding unnecessary effort, all the better. That wiring lives on today — we crave simplicity, repetition, ease.

In short: our brains are designed to be lazy. That’s not a flaw. It was once a survival feature.


Dopamine, Sugar, and the Reward System Hijack

But that same energy-saving instinct is also why we struggle with sugar, junk food, Instagram scrolls, and online shopping. These things hijack the brain’s reward system, specifically the mesolimbic dopamine pathway, giving us quick hits of pleasure with minimal effort.

It’s the same pathway involved in addictive behaviours. Studies show that sugar bingeing, for instance, triggers cravings, withdrawal, and tolerance, not unlike drug addiction. Ultra-processed foods — loaded with sugar, salt, and fat — are designed to exploit this biology. Up to one in seven adults now show addiction-like responses to them.

These are not failures of willpower. They are features of a brain evolved for scarcity now dropped into abundance.


So What Does This Have to Do with AI?

AI is now offering us cognitive junk food: the Coco Pops of the tech world.

We no longer need to wrestle with ideas, draft outlines, or solve problems from scratch. We just ask a chatbot and move on.

And just like the pleasure of a doughnut replaces the nourishment of a meal, the satisfaction of getting an instant answer can replace the deeper cognitive benefits of figuring it out ourselves.

It’s short-term gratification dressed up as productivity — and the brain loves it. Every time AI relieves us of effort, we get a little dopamine spike: a reward. And like any reward, the more we chase it, the more automatic the habit becomes.


Games, Screens, and Cognitive Decline — or Protection?

Interestingly, not all tech is bad for the brain.

A massive UK Biobank study found that regular computer gaming was associated with better memory, faster reaction times, and a 19% lower risk of dementia. Another trial found that older adults with mild cognitive impairment who gamed three times a week showed measurable improvements in problem-solving and mental flexibility.

But the key? These were engaged, interactive, problem-solving activities.

In contrast, passive screen time — like mindless Netflix bingeing or idle gaming — was linked to poorer cognitive outcomes in older adults. It’s not the screen that matters. It’s the type of engagement.


AI: Tool or Crutch?

There’s no denying that AI can supercharge creativity and productivity. But the danger lies in how we use it.

Are we using it to:

  • Extend our thinking?
  • Collaborate and explore ideas?
  • Stress-test assumptions?

Or are we using it to:

  • Avoid effort?
  • Replace memory?
  • Skip the discomfort of not knowing?

A recent study on AI writing tools found that the more people used them, the less they engaged in reflective thinking and the more they offloaded cognitive effort. Over time, that could chip away at skills we rely on every day: problem-solving, critical thinking, even creativity.


What Are We Teaching the Next Generation?

Here’s the bit that really worries me.

We’re giving children and young adults unrestricted access to AI tools — often with no guidance on how to use them healthily. No understanding of what’s happening in the brain. No guardrails.

It’s like handing a kid a pack of cigarettes in the 1960s and saying, “Go on then, be smart.”

But maybe cigarettes aren’t the best analogy anymore. A stronger one?

AI is ultra-processed food for the brain.

It’s convenient and it feels good in the moment, but when it displaces the “nutrients” of real thinking (the struggle, the creativity, the learning), we start to see the long-term cost.

We urgently need a culture of cognitive hygiene — one that teaches:

  • How to use AI with awareness, not blind dependence
  • When to lean on it, and when to push through discomfort
  • How to balance convenience with curiosity
  • And above all, the importance of discipline — because just like with food, overconsumption has consequences

AI isn’t inherently harmful, but using it constantly without thought or restraint can quietly erode our ability to think deeply, solve problems, and make decisions.

Because thinking — real thinking — isn’t obsolete. It’s what makes us human.


Final Thought: Evolution Isn’t Destiny

Yes, our brains are wired for ease, reward, and shortcuts. But they’re also capable of plasticity — of change, learning, and growth.

AI isn’t the enemy. Complacency is.

The brain will always love shortcuts. But our job now is to know when to take the shortcut, and when to take the long way round — because that’s where the growth happens.


Written with the help of ChatGPT, but also shaped by my own brain, my curiosity, and prior knowledge from books and articles I’ve read on cognitive behaviour. I questioned what we’re doing, guided the AI toward the message I wanted to convey, and made deliberate choices about structure and tone. It took me just over an hour. It could have taken far less if I’d let the AI do everything, and much longer if I’d started from scratch without a clear idea of where I wanted it to go. But then again, that’s kind of the point.

– “Damobot”

Appendix: Research & References

  1. Sugar addiction and neural changes
    Rada, P., Avena, N. M., & Hoebel, B. G. (2005). Neuroscience, 134(3), 737–744.
    https://pubmed.ncbi.nlm.nih.gov/15987666/
  2. Ultra-processed food and addiction
    Gearhardt, A. N. et al. (2023)
    https://www.theguardian.com/food/2023/oct/12/its-like-trying-to-quit-smoking-why-are-1-in-7-of-us-addicted-to-ultra-processed-foods
  3. Gaming and dementia risk (UK Biobank study)
    https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11186151/
  4. Gaming and cognitive performance (MCI study)
    https://pubmed.ncbi.nlm.nih.gov/40026301/
  5. Digital engagement lowers dementia risk
    https://nypost.com/2025/04/17/health/study-digital-tech-use-lowers-risk-of-brain-decline-by-42/
  6. Passive screen use and cognition
    https://www.thesun.co.uk/health/32552977/sitting-increases-dementia-risk-depends-on-activity/
  7. Cognitive offloading and AI writing tools
    Gerlich, M. (2025). Societies, 15(1), 6.
    https://www.mdpi.com/2075-4698/15/1/6
