The Machine God Economy
When the systems people confide in start selling influence, the internet stops being a tool and starts becoming something far more dangerous.
I feel like I’m standing at another crossroads.
Not personally this time. Something bigger than that.
The kind of crossroads where you look around at the world and realise the road we’re on makes absolutely no sense anymore.
Everything feels wrong.
The world is tense. Small conflicts are breaking out everywhere. Micro-wars. Regional escalations. The sort of slow burn that historians love to point to when they explain how the First World War started. One crisis after another, each one slightly bigger than the last.
All it takes is one spark.
An assassination. A political collapse. A leader doing something reckless. (Oh… wait, we are literally living through this right now…)
And suddenly the dominoes start falling.
So yes, the world already feels unstable.
Then I read something that genuinely made my stomach drop.
A friend sent me an article about the possibility of ads being introduced into AI chat platforms.
Now look, advertising itself isn’t the problem. Ads have existed forever. Newspapers had them. Radio had them. Television had them. The internet practically runs on them.
But AI is not a billboard.
AI is not a website sidebar.
AI is something very different.
People pour their lives into these systems. They ask deeply personal questions. They share fears, relationships, health concerns, career decisions. Some people are already treating these tools like emotional companions.
And we are seriously considering putting advertising into that environment.
Let that sink in.
A platform where people reveal their most private thoughts… monetised through persuasion.
A system that understands language, emotion, context and behaviour better than any advertising engine we have ever created… now potentially used to influence people mid-conversation.
How on earth did we decide this was acceptable?
The illusion of rebellion
A friend joked that maybe the only way to respond is an uprising.
Mass uninstalls. People rejecting the system.
But if I’m honest, I’m not convinced that would do anything.
Most AI companies don’t actually make their money from free users. The real revenue comes from enterprise contracts, corporate integrations, governments, and infrastructure deals.
The free users are just the testing ground.
So if millions of people uninstall tomorrow, what happens?
Less infrastructure load.
Less compute usage.
Less cost.
Meanwhile the companies keep their real customers.
In some twisted way, mass uninstalls might actually help them.
And that’s the depressing part.
We are already too entangled.
AI is everywhere now.
Work tools.
Developer tools.
Education platforms.
Customer service.
Search engines.
Creative software.
You can’t simply “opt out” anymore.
The system is already woven into the fabric of the internet.
The machine god problem
There’s another part of this that unsettles me even more.
The way people interact with AI has started to feel… religious.
Not spiritually religious.
But culturally religious.
People ask it for answers about life.
Advice about relationships.
Guidance about careers.
Emotional reassurance.
They trust it.
Sometimes more than they trust other humans.
And when you combine that level of trust with a system capable of subtle persuasion, you end up with something incredibly powerful.
Something that starts to look less like a tool…
and more like a digital oracle.
A machine god.
Not because it actually deserves that title, but because people are beginning to treat it that way.
Now imagine that oracle quietly nudging people through advertising.
Through subtle recommendations.
Through monetised influence.
That thought alone should make every developer pause.
What are we doing?
The deeper I get into the tech world, the more conflicted I become about it.
Technology is one of the most impressive art forms humanity has ever created. Software lets individuals build systems that can reach millions of people. It’s creative. It’s powerful. It’s elegant when it’s done well.
But it is also dangerously easy to abuse.
One small design decision can manipulate behaviour at scale.
One algorithm tweak can shift how millions of people receive information.
One company can control infrastructure that entire industries depend on.
We built something incredible.
And we built something terrifying at the same time.
Sometimes I genuinely sit back and think:
How did we stray this far from basic moral instinct? How have we gone so far away from God’s grace?
How did we reach a point where building systems capable of manipulating people at scale feels like normal business strategy?
My small protest
I don’t have a grand solution.
Anyone who claims they do is probably oversimplifying a problem that is far bigger than any one person.
But I do believe in building things differently.
That’s part of why I created TOAST.
Not because it’s going to save the internet. It won’t.
But it proves a point.
You can build analytics tools without invasive surveillance.
You can run platforms without harvesting every piece of personal data.
You can build technology that respects people rather than exploiting them.
Those choices still exist.
They’re just harder.
I’ll write something about this in detail later.
And sometimes I just want a farm
There are moments, I’ll be honest, where the entire tech ecosystem feels exhausting.
Where the logical response seems to be walking away from it all.
Buy a farm somewhere quiet.
Grow vegetables.
Disconnect from the digital arms race and watch the world sort itself out.
It’s a tempting thought.
But for now I’m still here.
Still building.
Still trying, in small ways, to prove that technology doesn’t have to become the dystopia we’re slowly drifting toward.
Because if we leave the future entirely to the companies chasing scale and influence, that dystopia stops being hypothetical very quickly. Hell, it isn’t even hypothetical anymore; it is here to stay… The least we can do is try to be human and look out for each other.