Disclaimer: AI assisted in the development of this blog article. But my process entails the following:
I record a 15-20 minute audio of my take via speechtotext.today (an app I vibe-coded myself, and which you can use for free).
I then bring that transcript over to ChatGPT and ask AI to structure my thoughts into a readable document.
I then have it write an article about it.
I finally proofread and EDIT the article, then build the blog post myself in HubSpot.
I'm both frustrated and disappointed by the times we're living in.
People that I think highly of are using AI more and more. Not just to produce work, but to outsource their brain, attention, involvement, care...
In 2026, I've been through at least 15 instances of clients, teammates, collaborators, and bosses sending AI-written tasks, messages, whole instructions, and documents my way. Without even having read them.
How do I know?
Because their instructions are not contextual.
I jump into several HubSpot portals every week.
Each has its own infrastructure and architecture set up in a way that requires careful attention to detail to grasp what is going on.
And even when a portal has a specific architecture in place, the way the sales, marketing, or customer service teams actually use it can differ. That's an entire layer of context that also matters when assigning tasks.
Yet when I get these AI-generated instructions, with a ridiculous deadline attached, and then jump into the portal only to find out that those instructions could literally mean ANYTHING inside the client's portal... I get frustrated.
This is my take on the day we stopped thinking at work...
There’s something happening in the workplace right now that no one is really talking about.
And it’s not automation.
It’s not job loss.
It’s not even AI replacing people.
It’s something quieter… and honestly, more dangerous.
It’s people replacing themselves.
I work in marketing, but collaborate with folks in sales, customer service, CRM, the whole business ecosystem, all year long.
I’m not speaking as a marketer or a consultant here.
Today, I’m speaking as a human being watching something go wrong in real time across the board.
And I’m seeing it everywhere.
People are making more mistakes than ever before.
Not because they’re incapable.
Not because they lack knowledge.
But because they’re no longer engaging with the work they’re doing.
Here’s what’s happening:
People are using AI to generate instructions…
and then assigning those instructions to other humans…
without even reading them.
Let that sink in.
Let’s be clear: AI isn’t the enemy.
Using AI to assist your thinking? Great.
Using AI to move faster? Amazing.
But that’s not what’s happening.
What’s happening is this:
They’re prompting… copying… pasting… and sending.
And generating more confusion for themselves and others.
We’re entering a new kind of workplace dynamic: AI doing the thinking while humans stop paying attention.
And that imbalance is creating chaos.
Because when no one actually understands the work, mistakes slip through, context gets lost, and confusion compounds.
And the worst part?
Everyone pretends everything is fine.
Here’s the uncomfortable truth:
The problem isn’t that AI is replacing humans.
It’s that humans are choosing to step aside.
When you prompt, copy, paste, and send without reading, you’re not using AI.
You’re outsourcing your brain.
There’s a deeper consequence here that no KPI will show.
If you let AI think for you consistently, today is the smartest you will ever be.
Because from that point forward, you’re no longer practicing the very skills that make you valuable.
You’re slowly losing them.
Not overnight.
But steadily.
Quietly.
This isn’t about rejecting AI.
It’s about reclaiming responsibility.
Here’s a simple reset:
If you didn’t read it, don’t send it.
No exceptions.
AI output should be a draft, not a decision.
Think back to a time when you worked through a problem with your own thinking, start to finish.
That version of you still exists.
Use it before it dies completely.
AI should support your thinking, not replace it.
AI is getting smarter every day.
But that doesn’t automatically mean we are.
And if we’re not careful, we’ll end up in a world where work keeps getting produced but no one actually understands it.
That’s not progress.
That’s erosion.
If you’re using AI today, good.
But ask yourself one question before you hit send:
“Did I actually think about this?”
If the answer is no… that’s where the real problem starts.