subtext service 001: self-surveillance
so you say you want your toilet to watch you?
hi. welcome. this is issue 001 - the first proper drop of subtext service
if you’ve read the ‘what is it?’ bit (issue 00), you already know this isn’t about hot takes or fast news. it’s about the signals that stick - and what I think they might really be saying
this one’s been creeping up on me for a while, literally…
it’s not about the flashy kind of surveillance, but the quiet kind - the kind we call wellness
this week, we’re not talking about being watched.
we’re talking about watching ourselves - and calling it care.
personal surveillance: scene setting
we used to worry about surveillance as something done to us: cameras in the street, cookies in the browser, targeted ads - a loss of privacy at the hands of others
but something’s shifted
surveillance has gone soft, personal, well-intentioned. it watches not to punish, but to optimise; we track our emotions, diets, bowel movements, brainwaves - and call it care
it stopped feeling coercive and became convenient, almost invisible: a gentle nudge toward a better self, a smarter body, a calmer brain.
and we’re opting in, happy to sell our data too…
in october 2024, UnitedHealth confirmed that more than 100 million Americans had their private health data stolen in a ransomware attack on one of its subsidiaries. even now, the company’s still figuring out who’s been affected.
we’re handing over more of ourselves than ever - and the systems meant to protect that intimacy aren’t just fragile - they are actively failing.
it’s a shift from public surveillance to self-surveillance - ambient, automated, and increasingly wrapped in the language of health, productivity, and self-awareness.
but when every aspect of you becomes legible - when your food, mood, gut and gaze are logged and interpreted - you’re not just being watched; you’re being translated and quantified.
self-tracking used to be a choice - now, it’s a lifestyle.
we wear devices that monitor our sleep, our stress, our steps, our bites. we call it optimisation. but what we’re really doing is surveilling ourselves.
when care becomes data, and data becomes product, it’s worth asking:
are we in control - or just being interpreted?
what it looks like
hair that reads your thoughts
researchers have created a flexible, hairlike sensor that records your brain’s electrical activity - no helmet, no wires, just a thread on your scalp collecting EEG data.
subtext: cognitive privacy is dissolving - and we’re styling it; when thoughts become legible, who decides what they mean?
your toilet is watching
throne is a health startup that clips a smart camera to your toilet, photographs your poop, and analyses it using ‘artificial gut intelligence’. it can detect hydration, diet and gut health - and you can request to see the photos too… if you really want to
subtext: shit used to be private, now it’s data - wellness is becoming diagnostic
perfect glasses for sociopaths
emteq’s smart eyewear uses optical sensors to read your micro-expressions - tracking your mood, jaw tension, even how often you chew… it’s designed to optimise emotional wellbeing, not connect you to it.
subtext: your face becomes a dashboard where feeling is measured, not felt; these tools don’t teach empathy - they help you simulate it
so what does it mean? where does it lead?
self-surveillance isn’t emerging - it’s embedded. in our homes, on our faces, wrapped around our wrists - it offers reassurance: that if we track enough, log enough, optimise enough, we’ll become the better versions of ourselves we keep being sold.
but underneath that promise is a shift in power - from intuition to data, from feeling to feedback, from knowing yourself to being interpreted by an external source.
this isn’t a wellness trend - it’s a new interface for identity, one where our value shifts based on our records
strategic implications
the question isn’t what can be measured, but how it’s being framed
this is about shifting from technical capability to design intention - it’s not just: ‘can we track facial expressions?’ but: ‘what message does that send the user about their emotions? are they now being told how they feel?’
some closing questions:
what does it mean to design for emotional transparency?
how do we build tools that support self-awareness without eroding self-trust?
if optimisation is the product - where does agency live?
the future of personal data is not only about privacy, but also about dignity
this is the very first issue of subtext service, so if you’ve made it this far - cheers.
consider this an open invitation for dodgeball - throw thoughts, questions, disagreements my way (and subscribe).
ciao for now,
Gab
PS: visual identity still TBC - so if this looks weird in your inbox… it’s not you, it’s me, sorry

