Ben PS

Technologist, Serial Creator, and Student of Intelligence—Both Artificial and Eternal

Is AI Turning Us Into System 1 Thinkers?

And why that might be more dangerous than we realize

In a world racing to automate everything from emails to strategy, a deeper question is emerging—quietly, but urgently:

Is artificial intelligence training us to think less?

To accept answers without challenge?
To prefer speed over depth?
To outsource judgment?

At the center of this concern is a deceptively simple concept from the world of behavioral psychology: System 1 vs. System 2 thinking.

Let's dive in.

The Two Systems of Thought

Nobel laureate Daniel Kahneman, a psychologist who won the prize in economics, introduced a powerful idea in his book Thinking, Fast and Slow. He explained that our brains operate using two different modes of thinking.

System 1 is fast, instinctive, and emotional. It’s what you use to catch a ball, finish someone’s sentence, or react to a sudden sound.

System 2 is slow, deliberate, and logical. It kicks in when you’re solving a math problem, writing a plan, or analyzing data.

Both systems are essential. The problem is that we default to System 1 far more often, because it's easy. It feels good. It saves energy.

Here’s a classic illustration.

A bat and a ball cost $1.10 in total.
The bat costs $1 more than the ball.
How much does the ball cost?

If you said “10 cents,” that’s System 1 jumping in.

But that answer is wrong.

The ball actually costs 5 cents. If it cost 10 cents, the bat would be $1.10, and together they'd cost $1.20. The correct answer comes only when you slow down and do the math, engaging System 2.
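The slow-down step is just two lines of algebra: if ball + bat = $1.10 and bat = ball + $1.00, then 2 × ball = $0.10. A quick sketch of that check in Python:

```python
# ball + bat = 1.10, and bat = ball + 1.00
# Substituting: ball + (ball + 1.00) = 1.10  ->  2 * ball = 0.10
ball = (1.10 - 1.00) / 2
bat = ball + 1.00

print(f"ball  = ${ball:.2f}")        # $0.05, not $0.10
print(f"bat   = ${bat:.2f}")         # $1.05
print(f"total = ${ball + bat:.2f}")  # $1.10
```

The intuitive "10 cents" answer fails the check: a 10-cent ball plus a $1.10 bat totals $1.20.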

And that’s exactly what’s starting to disappear in our daily work lives.

AI as Our New System 1

Tools like ChatGPT, Claude, Gemini, and Copilot have rapidly become part of how we write, plan, respond, brainstorm, and create. They generate drafts, suggest edits, simplify data, and even produce visuals.

They are fast, fluent, and—critically—confident.

In many ways, AI now plays the role of System 1 for our businesses. It gives us the quick answer. The shortcut. The template. The summary.

And it does it so well, we rarely stop to ask:

Is this right?
Is this appropriate?
Is this even true?

The danger is subtle but real: we’re getting used to not thinking deeply, because the first answer feels good enough.

That is the essence of System 1 dominance—and AI is reinforcing it at scale.

What We’re Losing When We Stop Thinking

When we rely too much on AI-generated outputs, several things start to fade:

Critical thinking
We stop questioning. We go with what “sounds smart.”

Judgment
We forget how to weigh context, nuance, and risk.

Skill
We lose the ability to write, plan, and analyze with depth.

Accountability
We accept the answer, even when we don’t fully understand it.

None of this happens overnight. But day by day, task by task, we slowly dull our own intelligence—without realizing it.

Real-World Example: A Business Owner’s Trap

Imagine you run a boutique creative agency. A client emails with a request for a proposal.

You ask ChatGPT to draft it. It produces something polished and persuasive. You glance through it. Looks good. You hit send.

But in the AI’s response:

  • The tone is slightly off-brand.

  • It recycles an outdated pricing structure.

  • It misses a specific request the client had outlined.

You didn’t catch it—because you didn’t really read it. You trusted the System 1 output. And now your reputation is taking a hit.

This is not a hypothetical. This is already happening across industries—writing, law, consulting, marketing, recruiting, even engineering.

So What’s the Solution?

It’s not to stop using AI. That would be like refusing to use calculators because we fear losing mental math.

The solution is to recognize that AI is your System 1 assistant—but you still need to be System 2.

Here’s what that looks like in practice.

Pause before you trust.
When AI gives you an answer, treat it like a smart intern. Helpful—but not infallible. Ask yourself, “Does this make sense in my context?”

Always edit.
Never send out AI’s first draft unedited. Make it yours. Challenge it. Sharpen it.

Balance speed with sense.
Let AI do the fast lifting—but make time for your own slow thinking where it matters most: strategy, tone, nuance, risk.

Keep training your thinking muscle.
Engage in deep work regularly. Write, reason, and solve problems manually when possible. Use AI as a tool, not a crutch.

My take

AI is not the enemy of intelligence. But it’s a tempting substitute for it.

The greatest threat is not that AI becomes too smart. It’s that we stop using our own.

In a world full of fast answers, the competitive edge belongs to those who know when to slow down and think for themselves.

So use the tools. Embrace the speed. But never stop asking:
Is this just System 1 talking? Or have I really thought this through?
