The Day I Accidentally Built Surveillance Software
Why I created “Burnt Toast”, and why it scared the hell out of me
I had a bit of an existential crisis this week. Not a philosophical one, a technical one. Which, in my line of work, might actually be worse. Because if you build things long enough, you eventually stumble into a very uncomfortable realisation: the technology we casually build every day can turn into a surveillance system incredibly easily. And I mean stupidly easily.
This whole thing started with something relatively harmless. I wanted to prove a point, a very simple one:
You do not need to track people to get useful data about how a website performs.
Seems obvious, right? Apparently not. Because most analytics tools on the internet operate under the assumption that the only way to understand behaviour is to profile the human being behind the browser. Track the user. Fingerprint the device. Store identifiers. Link sessions. Somewhere along the way the industry decided this was normal.
So I started building my own analytics engine. Just a little project, a JavaScript file, a few endpoints, some structured events. Something lightweight I could use inside my own systems to understand what was happening on websites. Nothing crazy.
Or so I thought.
The moment I realised something was wrong
While migrating some infrastructure recently I had a weird gut feeling. You know that developer instinct where something in the back of your brain goes:
“You should probably check that.”
So I did. I started digging into what the system was actually storing, and what the system could store. Then I made the mistake of asking AI to help optimise the data model. That’s when things got… uncomfortable.
Because once everything was laid out clearly, I realised something:
I had accidentally built a very powerful surveillance tool.
Not intentionally. Not maliciously. But the architecture was capable of it, very capable.
The moment I looked at the raw structure of what was technically possible, it hit me all at once: IP addresses, query strings, form submissions, interaction events, device data, mouse movement, session behaviour. Individually these things look harmless.
But together?
Together they can become a unique fingerprint of a human being.
Which is exactly the point where analytics quietly becomes surveillance.
And that was the moment I went:
“Oh. Oh no.”
Meet Evil Toast
So for a brief moment in time, the internal prototype earned a new nickname: Burnt Toast. Or sometimes Evil Toast. Because if you really wanted to build something invasive, the blueprint was already there.
And here’s the scary part.
It wasn’t complicated. It wasn’t some massive engineering effort. It was basically just a JavaScript file, an event pipeline, and a database.
That’s it.
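To show just how little code that is, here is a minimal sketch of the client side of such a pipeline. The function names and endpoint are illustrative assumptions, not the actual prototype:

```javascript
// A deliberately minimal event pipeline: roughly all the client-side
// code an invasive tracker needs. Names and the "/collect" endpoint
// are illustrative, not the real prototype.
function buildEvent(type, props = {}) {
  return {
    type,           // e.g. "pageview", "click"
    ts: Date.now(), // event timestamp
    ...props,       // whatever else the script scoops up
  };
}

function send(event, endpoint = "/collect") {
  const body = JSON.stringify(event);
  // In a browser, fire-and-forget with sendBeacon; fall back to
  // fetch where sendBeacon is unavailable.
  if (typeof navigator !== "undefined" && navigator.sendBeacon) {
    navigator.sendBeacon(endpoint, body);
  } else if (typeof fetch !== "undefined") {
    fetch(endpoint, { method: "POST", body, keepalive: true });
  }
}
```

Everything interesting (and everything dangerous) lives in what you choose to put inside `props` and what the database does with it afterwards.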
Which leads to a slightly terrifying realisation. If I can accidentally build something like this while experimenting, someone else has absolutely built something like this intentionally. Probably thousands of times.
And those scripts are probably already floating around the internet, injected into compromised websites, embedded into dodgy plugins, sitting inside marketing tools nobody audits, collecting far more information than people realise.
That thought alone was enough to make me pause for a moment and go:
“Okay, this is actually kind of horrifying.”
Because every time you visit a website, your browser happily hands over a ridiculous amount of information without you ever thinking about it. Screen size. Browser version. Timezone. Language. Navigation path. Referrer. Interaction timing. Sometimes even form data.
You click a page and your browser basically says:
“Hello website, here is a small biography about the human operating me.”
And we all just pretend that’s normal.
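That "small biography" is only a slight exaggeration. A handful of property reads assembles it. Here is a sketch written as a function over navigator- and screen-shaped objects so each read is explicit; in a real page you would simply pass the globals:

```javascript
// Assemble the "biography" any page can read with zero permission
// prompts. Takes navigator/screen-shaped objects so the property
// reads are explicit; in a browser: describeClient(navigator, screen).
function describeClient(nav, scr) {
  return {
    language: nav.language,   // e.g. "en-AU"
    userAgent: nav.userAgent, // browser + OS version string
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    screen: `${scr.width}x${scr.height}`,
    referrer: typeof document !== "undefined" ? document.referrer : "",
  };
}
```

None of these values identifies you on its own. Combined, and joined across sessions, they narrow the crowd down frighteningly fast.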
The legal reality check
Then came the second realisation, which was less philosophical and more legal.
Australia’s privacy laws define personal information extremely broadly. If a person is reasonably identifiable from the data, it counts.
Which means things like IP addresses, device characteristics, behavioural patterns, form data, and query strings can quickly cross into personal information territory.
Which means if you collect them improperly, congratulations:
You’ve just built software that regulators will absolutely hate.
And I happen to work with industries that care about this a lot: healthcare, community organisations, churches, government-adjacent organisations. You do not want to show up to those environments with software that looks like an advertising surveillance stack.
So suddenly the project had a new objective.
Not optimisation.
Restraint.
The weird solution
The solution was strangely simple.
First I built the worst possible version. Then I castrated it. Lobotomised it. Stripped it down to the point where it could no longer behave like surveillance software.
Remove IP storage. Remove fingerprinting. Remove form payload capture. Remove persistent identifiers. Remove cross-session tracking. Remove anything that could reasonably identify a person.
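In code terms, that restraint is easiest to enforce with an allowlist rather than a blocklist: decide exactly which fields an event may carry, and drop everything else at ingestion. A sketch, assuming a hypothetical schema (the field names are illustrative, not the real one):

```javascript
// Allowlist-based event sanitiser: anything not explicitly permitted
// (IPs, form payloads, fingerprint-grade device data, persistent IDs)
// simply never reaches storage. Field names are illustrative.
const ALLOWED_FIELDS = new Set(["type", "ts", "path", "status", "durationMs"]);

function sanitizeEvent(raw) {
  const clean = {};
  for (const [key, value] of Object.entries(raw)) {
    if (ALLOWED_FIELDS.has(key)) clean[key] = value;
  }
  return clean;
}
```

The allowlist is the whole point: a blocklist fails open when someone adds a new field, while an allowlist fails closed.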
What remained was something surprisingly powerful.
Because once you remove identity capture, the data becomes cleaner. You stop chasing individuals and start analysing systems, pages, performance, conversion bottlenecks, engagement patterns, technical issues, actual website behaviour.
Not human dossiers.
And suddenly the analytics engine became something else entirely. Something I’m actually comfortable deploying.
The new version: TOAST
The new version simply became TOAST.
Privacy-first analytics. No fingerprinting. No personal identifiers. No creepy behavioural replay. Just structured signals about how websites perform.
Which, it turns out, is actually enough to make good decisions. Possibly even better decisions. Because the data becomes consistent instead of messy.
Instead of obsessing over individuals, you focus on patterns.
And patterns are what optimisation actually needs.
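Pattern-level analysis is also just less code. Once events carry no identity, questions become simple aggregations; a sketch (event shape assumed, as above):

```javascript
// Pattern-level analysis: aggregate anonymous events per page instead
// of building per-person timelines. Event shape is illustrative.
function pageviewsByPath(events) {
  const counts = new Map();
  for (const e of events) {
    if (e.type !== "pageview") continue;
    counts.set(e.path, (counts.get(e.path) || 0) + 1);
  }
  return counts;
}
```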
The bigger lesson
The real takeaway from this whole experience wasn’t technical.
It was philosophical.
We are entering a world where building powerful technology is becoming absurdly easy. AI can now help optimise systems that once required entire engineering teams. Infrastructure is cheap. Data pipelines are trivial to spin up.
Which means the barrier to building powerful tracking systems has basically evaporated.
That should probably concern people a little more than it currently does.
Because the line between analytics and surveillance is much thinner than most people realise. And if you build software long enough, eventually you’ll run into that line yourself.
Trust me.
I did.
And the moment you realise how easy it is to cross it…
You start designing systems very differently.