Every Customer a Suspect: Inside Australia’s AI-Powered Supermarkets
Facial recognition, AI checkouts, loyalty cards and why Aussie retailers are turning your weekly shop into a behavioural profile.
So I was talking to a friend the other day about something that, honestly, should be old news by now: supermarkets and big-box retailers quietly turning into data-harvesting surveillance machines.
At the time, we were talking about Coles because there was this whole wave of stories about them “tracking people in-store” to deal with shoplifting, and it had real Palantir energy to it. The vibe was: smart gates at the exits, AI systems flagging “suspicious behaviour,” cameras that follow you from checkout to door.
In my head that immediately turned into:
“Cool, so now Coles has facial recognition and a dossier on everyone who buys grapes.”
To be fair, that bit isn’t quite right.
Quick reality check
Coles is rolling out AI-powered smart gates and overhead sensors that track whether a person leaving has actually paid, but they’ve explicitly said they do not use facial recognition in their stores. It’s tracking movement, not faces.
Woolworths has gone hard on AI at self-checkouts: overhead cameras that spot mis-scans and dodgy weighing tricks. But they blur faces and PIN pads, and say the system doesn’t do facial recognition or biometric matching.
Kmart, on the other hand, actually did go full “scan-every-face” mode, and the Privacy Commissioner has now ruled that what they did was unlawful.
Bunnings also used facial recognition across 60+ stores and was found to be in breach of the Privacy Act too. Hundreds of thousands of people scanned without proper consent.
The Good Guys dipped a toe in, got called out by CHOICE, and immediately hit the pause button while the regulator looked at it.
So the “oh my God, the store is watching you” thing isn’t just a vibe; it actually happened, and in some cases, got smacked down by the law.
And it got me thinking back to that conversation I had with my friend, where I was explaining:
why they’re doing all this,
how it works,
what it actually means,
and how there are perfectly good, non-invasive ways to optimise stores that don’t require turning everyone’s face into a dataset.
So this is basically that conversation, just written down and slightly less sleep-deprived.
Cameras? Fine. Facial Recognition? That’s Where I Draw the Line.
Cameras in stores aren’t new.
When I was in Woolies in my early twenties, I swear people were already talking about “advanced CCTV” and maybe-facial-recognition, and no one really cared because:
every store has cameras,
CCTV is boring now,
and we all kind of accept that if you rob a place, the footage will be used.
And honestly, there are legit reasons for basic cameras:
They deter theft.
They give you a timestamp when something dodgy happens.
They help identify actual criminals when things go properly wrong.
Plus: you’re on private property. When you walk in, whether you like it or not, you’re agreeing to the possibility of being filmed for security.
If I had a house full of cameras and something happened, you’d better believe I’d want that footage.
Where it goes from “fair enough” to “hang on a second” is this:
When they’re not just filming the store…
they’re tracking you as a person.
Your face.
Your movement patterns.
Your behaviour over time.
Your profile.
That’s the line for me.
I totally get why some people are full anti-surveillance and hate any cameras at all. I’m not quite there; I accept there are bad actors in the world and we need some protection. But when it jumps from:
“Help me catch the guy who stole the till”
to
“Track every step every customer takes, forever, just in case”
…yeah, no. That’s not it.
And that’s exactly the line Kmart and Bunnings crossed.
What Actually Happened With Kmart, Bunnings, and The Good Guys
Let’s talk specifics for a second, because this isn’t just vibes; there are actual legal decisions now.
Kmart: “We Just Wanted to Prevent Refund Fraud”
From June 2020 to July 2022, Kmart quietly ran facial recognition technology across 28 stores in Australia. The goal: crack down on refund fraud.
Not violent crime.
Not armed robbery.
Not organised gangs.
Refund. Fraud.
Here’s what the system did:
Filmed everyone entering certain stores
Scanned their faces
Turned those faces into biometric “faceprints”
Compared those against a watchlist of suspected refund abusers
Only kept the biometric data of people on the list (in theory) and deleted everyone else’s shortly after
The key thing: this is biometric information (specifically, face templates), which counts as sensitive information under Australia’s Privacy Act. Same legal category as your health data or your religion.
Sensitive info = you must get clear, informed consent before you collect it.
Kmart did not get that.
They argued:
“We don’t need consent. We’re preventing unlawful activity.”
There is an exemption in the law if you need to collect data to prevent unlawful activity or serious misconduct.
But the regulator basically said:
you didn’t show refund fraud was serious enough to justify doing this to everyone;
“abusive refund behaviour” is not the same as “serious unlawful activity”;
scanning every face walking through the door, just in case, is disproportionate.
So the Privacy Commissioner found that:
Kmart breached the Privacy Act by collecting sensitive biometric data without valid consent
Its signage and privacy disclosures weren’t clear or prominent enough to count as proper notification
Mass biometric surveillance for a relatively small number of bad actors was overkill; the privacy risk outweighed the benefit
Kmart has been ordered not to bring the tech back, and to publish the finding. They may still appeal, but the precedent is there.
Bunnings: “We Needed It for Safety”
Bunnings ran facial recognition across 62–63 stores between late 2018 and late 2021. Their line was:
“We only use it to identify banned people and protect staff and customers from violent and organised crime.”
The OAIC looked at it and went: still not okay.
The system scanned every face that entered; likely hundreds of thousands of people
It created biometric templates and checked them against a banned/wanted list
They did not obtain proper consent, and their notifications were inadequate
The Commissioner found they breached multiple Australian Privacy Principles and “interfered with the privacy of individuals”
Bunnings has been told to destroy the biometric data and not repeat the behaviour. They’re now trying to challenge the ruling at the tribunal, arguing the tech was essential to deal with real threats and repeat offenders.
Again though, the core logic from the regulator is the same:
You can’t scan everyone’s face just in case a few of them are trouble.
The Good Guys: “Yeah Okay, We’ll Pause”
The Good Guys trialled optional facial recognition in two Melbourne stores, got named in a CHOICE complaint to the regulator, and quickly went:
“We’re pausing this until the OAIC tells us what’s up.”
CHOICE called the whole thing “unreasonably intrusive” and potentially unlawful.
So The Good Guys backed off early. Smart move.
Meanwhile: Coles and Woolies
In the same time period:
Coles has gone heavy on smart gates and AI tracking, but insists it’s not doing facial recognition; it’s linking movement at checkout to the gates unlocking or staying shut if there are unpaid items. It’s surveillance, sure, but it’s about transactions, not identity.
Woolworths uses AI cameras at self-checkouts to detect mis-scans and “banana as brown onion” moments. Faces and PIN pads are blurred, they say there’s no facial recognition or biometric matching, and there’s signage with an opt-out path (use staffed checkout).
Privacy folks still don’t love it (the Guardian literally ran a headline about Woolies’ AI treating “every customer as a suspect”), but legally, Coles and Woolies are playing in a different sandbox to Kmart and Bunnings.
TL;DR on the law:
Basic CCTV + some AI → generally okay if you’re transparent
Blurred, non-identifying AI → less risky
Full-on facial recognition → “sensitive information”, needs serious justification + explicit consent
Kmart and Bunnings crossed the line
The Good Guys backed out early
Coles and Woolies are dancing carefully on the edge
The Privacy Act is old and vague, so each of these rulings is basically patching the law on the fly to deal with tech it wasn’t built for.
Right. Legal detour done.
Back to why this all feels so gross.
The Good Stuff: Non-Invasive Tracking That Actually Helps People
Here’s the thing: I’m not anti-data.
I’m a marketer.
My brain is wired for:
data
patterns
optimisation
“how do we make this perform better without blowing the budget?”
And if you care about small businesses, data can literally be the difference between surviving and folding.
Because small businesses:
have limited budgets
can’t afford enterprise AI platforms
don’t have teams of analysts
are already running on fumes
So anything that helps them optimise, done ethically, is a massive win.
And a lot of tracking tech has existed for ages in ways that are… actually fine.
You don’t need to track people individually.
You just need to track movement.
Heatmaps, Not Humans
Imagine a camera pointed down at the shop floor.
Instead of learning faces, it just watches:
motion
pixel changes
blobs walking around
Over time, you generate a heatmap:
where people walk the most
which sections they ignore
where they hover
where bottlenecks form
You don’t know who they are.
You just know how the space is being used.
No profiles.
No “Tomas likes aisle 3.”
Just: “this area is hot, that corner is dead.”
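
To make that a bit more concrete, here’s a minimal sketch of the idea using OpenCV’s background subtraction. The footage file and the whole setup are hypothetical placeholders, not any store’s actual system; real retail analytics products are fancier, but the anonymous principle is the same.

```python
# Minimal sketch of anonymous foot-traffic heatmapping, assuming OpenCV
# is installed and "floor_cam.mp4" is an overhead camera recording.
# (Both the file and the setup are hypothetical.)

import cv2
import numpy as np

cap = cv2.VideoCapture("floor_cam.mp4")   # hypothetical overhead footage
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
heatmap = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground mask: "something moved in these pixels". No faces, no IDs.
    mask = subtractor.apply(frame)
    if heatmap is None:
        heatmap = np.zeros(mask.shape, dtype=np.float32)
    # Busy spots accumulate a higher score over time.
    heatmap += (mask > 0).astype(np.float32)

cap.release()

# Normalise and colour it so hot aisles show up red and dead corners blue.
norm = cv2.normalize(heatmap, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
cv2.imwrite("floor_heatmap.png", cv2.applyColorMap(norm, cv2.COLORMAP_JET))
```

Every frame just adds a bit of “something moved here” to a running total. The output is a picture of your floor, not a record of any person.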
The Art Gallery Example
Let’s say you run an art gallery.
You install a camera, feed the footage into some simple software, and it spits out heatmaps of foot traffic and dwell time.
No facial recognition. No linked identity. Just movement.
Suddenly you can see:
People keep missing the small room at the back
One painting at the entrance gets heaps of attention
Another piece in a weird corner is basically invisible
People don’t realise they can go around that wall
Now you can:
move hidden works into high-traffic areas
add signage or change the flow
literally tell an artist: “Hey, your piece in this spot got a ton of views”
Everyone wins:
visitors have a better experience
artists get more exposure and feedback
you make the gallery feel intentional instead of random
And you did it without tracking a single identity.
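
If you wanted the dwell-time side of that, an equally anonymous version could aggregate movement per zone instead of per pixel. Again, the zone rectangles, footage and numbers below are invented; the output is activity per area, never “who stood where”.

```python
# Same idea as the heatmap, but aggregated per zone, assuming the same
# kind of overhead footage. Zone rectangles and "gallery_cam.mp4" are
# invented for illustration.

import cv2

ZONES = {                                  # (x1, y1, x2, y2) in pixels
    "entrance_wall": (0, 0, 640, 360),
    "back_room": (640, 0, 1280, 360),
    "weird_corner": (0, 360, 640, 720),
}

cap = cv2.VideoCapture("gallery_cam.mp4")  # hypothetical footage
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
activity = {name: 0.0 for name in ZONES}

while True:
    ok, frame = cap.read()
    if not ok:
        break
    moving = subtractor.apply(frame) > 0
    for name, (x1, y1, x2, y2) in ZONES.items():
        # Fraction of the zone with movement this frame, scaled to seconds.
        activity[name] += moving[y1:y2, x1:x2].mean() / fps

cap.release()
for name, seconds in sorted(activity.items(), key=lambda kv: -kv[1]):
    print(f"{name:>14}: ~{seconds:.0f} activity-seconds")
```

Sort that output and you instantly see the back room is being ignored, without ever learning a single visitor’s name.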
The Vinyl Shop Example
Same idea, different setting.
You run a shop like Beatnik. You notice:
People bolt straight to the King Gizzard & The Lizard Wizard section
They walk right past the Beatles bin
With movement tracking, you figure out:
the Beatles are in a dead zone
King Gizz is the magnet
So you rearrange:
put Beatles where people naturally pass
piggyback off King Gizz traffic
Suddenly, Beatles sales go up.
Still no identity. No profiling. Just:
bodies moving through space
layout decisions informed by actual data
This is the kind of stuff I wish more people associated with “retail analytics” instead of “AI face scanner that judges you for buying Shapes.”
Where It Gets Cooked: When Stores Start Tracking You
Now here’s where things go from “pretty clever, actually” to “absolutely cooked” very fast.
Tracking movement is one thing.
Tracking individual humans is another.
And retail already has everything it needs to glue those together.
Enter: loyalty programs.
Loyalty Cards Are Already Tracking You
We all know this drill:
You sign up for a loyalty card
You tap it when you buy things
They log everything you purchase
You get emails like:
“Hey, you bought this six weeks ago, it’s on sale again!”
“Because you bought X, you might like Y.”
That’s standard now. You trade a bit of data for discounts, points, or targeted offers.
And honestly? On its own, it’s not the end of the world.
But now combine:
loyalty card data
in-store cameras
AI that tracks movement and objects
facial recognition
and third-party ad platforms
Now things get weird.
Enter: Tmart (Our Fictional Megastore)
We’re not doing this with Kmart directly because: lawyers.
So: Tmart.
Tmart is a big department store. Think hybrid of Kmart, Target, Costco, Big W.
They:
sell everything from food to kitchenware to clothes
have CCTV everywhere
have a loyalty card system
value “innovation”, which is code for “we like data”
They already track:
what you buy
how often you buy
which store you visit
how much you spend
Standard stuff.
Now an AI company comes knocking:
“Give us all your CCTV footage.
We’ll track every person who enters, where they go, what they touch, what they put back, and then link that to your loyalty system.”
Suddenly:
your face gets matched to your loyalty profile at the self-checkout
that face becomes a permanent identifier
every time you walk into any Tmart, across any suburb, they know it’s you
they can replay your entire visit as a data trail
Every aisle.
Every pause.
Every product you picked up and put back.
Every time you stalked the snacks and then guilt-walked away.
That’s not optimisation anymore.
That’s comprehensive behavioural profiling.
Tomas and the Magazine
Let’s keep going with this.
Say I walk into Tmart.
I go to the magazine section.
I pick up New Scientist.
I flip through it. I like it. Then I see the price, $13, and go:
“Yeah nah, not paying that.”
I put it back. I walk out.
I didn’t buy it.
I didn’t search it online.
I didn’t tell anyone I wanted it.
But the system logged:
my face
the timestamp
my position in the store
the product I was holding
the fact I put it back
Then it ties that to:
my loyalty record
my email
my ad IDs via third-party partnerships
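
Put bluntly, that one “picked it up, put it back” moment could reduce to a single record like the sketch below. Every field name and value here is invented for illustration; no retailer has published a real schema, and Tmart doesn’t exist anyway.

```python
# Purely hypothetical sketch of one behavioural event, once it's been
# stitched to an identity. All fields and values are made up.

event = {
    "face_template_id": "ft_19c4e2",          # biometric match from entry cameras
    "loyalty_id": "TMART-MEMBER-004217",      # linked at a previous checkout
    "store": "Tmart Example Plaza",
    "timestamp": "2025-03-14T11:42:07+11:00",
    "zone": "magazines",
    "product": "New Scientist",
    "action": "picked_up_then_returned",
    "dwell_seconds": 94,
    "shared_with": ["analytics_partner", "ad_platform"],  # where it flows next
}
```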
Next week, suddenly:
I’m seeing “New Scientist on sale at Tmart!” ads
my socials are feeding me science-magazine carousels
Google is showing me “subscribe and save” options
my brain goes, “Oh yeah, I was thinking about that…”
At no point did I tell anyone I wanted it.
They just inferred it from my body language and behaviour.
That’s the creepy bit.
The Data Doesn’t Just Sit There
The scariest part isn’t just that they know.
It’s that they use it.
And it doesn’t live in one neat little box.
It’s:
stored on some server (often a third-party cloud)
maybe processed somewhere overseas
maybe shared with consultants, vendors, or “analytics partners”
sitting inside systems built by companies who might get acquired tomorrow
The laws around this stuff are still fuzzy and half-outdated. Privacy watchdogs are basically speed-running case law just to keep up.
Meanwhile, the tech is already here.
The incentives are huge.
And once data exists, someone is going to monetise it.
Now Add AR/VR and Big Tech Into the Mix
Now jump forward a few years.
Cheap AR/VR headsets with ads built in are everywhere.
Combine:
in-store tracking (movement + possibly face)
your Tmart loyalty profile
your Google data (search, YouTube, Chrome)
your Meta data (Instagram, Facebook, pixel tracking)
And you get:
you walk past a newsstand in real life
your headset pings:
“Hey Tomas, remember that New Scientist you picked up last week? 15% off at Tmart today.”
your feeds are full of hyper-specific nudges based entirely on things you almost bought
This isn’t “you viewed X webpage, so here’s an ad for X.”
This is:
“We watched you hesitate in front of X in real life
and now we’re going to bombard that weak spot from every angle.”
And as a marketer, I can’t lie:
part of my brain goes, “That would convert like crazy.”
As a human, I go, “This is cooked.”
The Marketer vs The Human
This is where I’m properly torn.
The Marketer in Me:
knows this would be insanely profitable
sees hotter leads and absurd conversion rates
loves the idea of killing guesswork with real behaviour data
can imagine “creatively” weaponising this in funnels
Instead of:
“We think you might want this”
it becomes:
“We know you want this, because you touched it, stared at it, and then put it back sadly.”
From a pure marketing standpoint, it’s genius.
The Human in Me:
hates the idea of my face being stored anywhere
doesn’t want every physical action to be ad fuel
thinks constant behavioural tracking is a step too far
does not want to live in a Minority Report demo for the rest of my life
We’ve already crossed lines online:
cookies
retargeting
fingerprinting
surveillance capitalism in general
Now we’re dragging that logic into physical space.
This isn’t just pixels being logged.
This is your body in the real world being logged.
The Australian Law Bit, In Plain English
Zooming back to the Kmart and Bunnings rulings for a sec, because they’re important.
Under the Privacy Act 1988 in Australia:
Your biometric data (face templates, etc.) is sensitive information
Collecting it requires:
a clear need,
transparency, and
usually explicit consent
There is an exemption for preventing “unlawful activity or serious misconduct.”
Kmart and Bunnings both tried to use that:
Kmart: “refund fraud and safety”
Bunnings: “violent and organised crime, banned people, staff safety”
The regulator basically said:
you scanned everyone
your signage was tiny / unclear
most of the people you scanned weren’t doing anything wrong
the use of tech was disproportionate to the risk
and the exemption doesn’t let you mass-scan faces just in case
So:
Kmart = breach
Bunnings = breach
The Good Guys = paused early after CHOICE complained
Meanwhile, AI systems like Coles’ smart gates and Woolies’ blurred checkout cameras haven’t been found unlawful because they avoid biometrics and/or give people clearer notice and some kind of opt-out.
The biggest takeaway from all this is:
It’s okay to track what happens in a store.
It’s not okay to secretly track who people are.
And right now, the Privacy Act is playing catch-up. We don’t have a shiny, modern “Biometrics Act” or “Retail AI Act.” We just have:
vague 1980s-era privacy principles
judges and commissioners stretching them to fit 2025 technology
and early cases like Kmart/Bunnings that everyone else is now reading as “what not to do”
So Where Does That Leave Us?
This is the world we live in:
There are bad actors, and stores do need security.
Good analytics do help small businesses survive.
Cameras, on their own, are not the enemy.
But linking cameras → faces → identity → behaviour → ads? That’s the line.
Once you attach data to a specific person, you’re not just optimising a layout anymore.
You’re building a profile.
Profiles can be:
sold
leaked
misused
used to discriminate against you
And they will be, if we’re not careful.
So yeah, the nightmare just kind of rolls on. It doesn’t stop. It just gets better at recognising your face, remembering what you picked up, and following you around the internet afterward.
If we want the good parts of this tech without living in a retail panopticon, the basic rules should be:
Track behaviour, not identity.
Optimise stores, not people.
Use AI, but don’t build a biometric database of the entire country.
Or, in plain English:
Just because you can scan everyone’s face doesn’t mean you bloody should.
Welcome to the future.
Bring your loyalty card.
Try not to think too hard about who’s watching.


