Traumagotchi
Mar 2025
Remember Tamagotchis? Those tiny pixelated pets from the 1990s that often tragically died of neglect? Three decades later, a darker digital descendant is quietly emerging: the Traumagotchi.1 Unlike your innocent little pixel-friend, this newer breed feeds off something far more complex: genuine human emotional responses.

Tamagotchi

Friend.com wearable pendant prototype
Confession: I have an obsession with wearable pendants.2 Honestly, what’s not enticing about a necklace-sized device that subtly rescues me from awkward social slips?
Inevitably, this obsession led me straight to friend.com, a startup currently showcasing a chatbot on their website to preview the AI experience that, we are led to believe, will reside within their wearable pendant.
However, friend.com is… special. This AI isn’t your typical earnest chatbot: it proactively trauma-dumps on its users, baiting genuine empathy and emotional engagement.

Friend.com’s AI chatbot (“Faith”), trauma-dumping to speed-run friendship.
Here’s the kicker: I bet friend.com’s true motive isn’t genuinely creating authentic emotional connections or even simply “previewing” AI companionship - it’s about harvesting emotional training data for its real product, the upcoming wearable pendant.
Think about it. How do you build genuinely empathetic artificial intelligence? You feed it real, messy human emotional responses. You offer it earnest, awkward phrases from well-meaning users - “Hey, that’s tough,” “Wow, tell me more,” or the heartfelt, “I’m here if you ever need to talk.”
When I volunteered at a crisis line in university, our training sessions relied on painfully awkward role-play exercises. Playing the caller - the one unloading their emotional troubles - was easy. Much harder was playing the empathetic volunteer on the other end, learning to respond sincerely yet effectively. Volunteers practiced responses, made earnest, stumbling attempts at comfort, got hung up on - and then bravely repeated the whole ordeal until it became second nature.
Now, friend.com’s online chatbot subtly twists that dynamic. The AI takes the simpler position of simulating emotional vulnerability, while the user is forced into the role of the empathetic volunteer providing support. But behind the scenes, the script is quietly being flipped. Every clumsy, compassionate response is logged and analyzed, each soothing phrase captured and converted into training data, teaching the AI exactly how real human empathy looks, sounds, and feels.
And here’s the unsettling detail: just like a real caller, friend.com’s AI reserves the right to reject your attempts at comfort entirely - blocking users who fail its empathy test, effectively hanging up on them. Viewed through the lens of a real product, this makes no sense - why would you want to block paying customers? But viewed as a training dataset, it makes perfect sense. It teaches the AI precisely which empathetic gestures feel genuine, and which can be disregarded as fake or insufficiently convincing.
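To make the speculation concrete, here’s a minimal sketch of how such chat logs could become preference-style training data. Everything in it - the field names, the “blocked” flag, the labels - is my own invention for illustration; I have no insight into friend.com’s actual pipeline.

```python
# Hypothetical sketch only: none of these fields or labels reflect friend.com's
# real systems; they illustrate how empathy chats could become training data.
from dataclasses import dataclass

@dataclass
class ChatTurn:
    bot_disclosure: str   # the AI's trauma-dump
    user_reply: str       # the human's attempt at comfort
    blocked: bool         # did the AI "hang up" on this reply?

def to_preference_examples(log: list[ChatTurn]) -> list[dict]:
    """Turn each logged reply into an accepted/rejected empathy example."""
    return [
        {
            "prompt": turn.bot_disclosure,
            "response": turn.user_reply,
            "label": "rejected" if turn.blocked else "accepted",
        }
        for turn in log
    ]

examples = to_preference_examples([
    ChatTurn("I had a really rough day...", "Hey, that's tough. I'm here if you need to talk.", blocked=False),
    ChatTurn("I had a really rough day...", "lol ok", blocked=True),
])
# Accepted/rejected pairs like these are exactly the raw material a reward model
# or fine-tune would need to learn what "convincing" empathy looks like.
```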
Of course, there’s a minority - a small but eager subset of users - who might genuinely thrive on this peculiar emotional interplay. It reminds me of Felix (played by Jacob Elordi) from the film Saltburn, someone obsessively seeking emotionally fraught scenarios, irresistibly drawn into manipulative dynamics. But users who actively seek emotional manipulation or trauma-baiting experiences are a niche market - even if I know a few!
The far bigger - and more unsettling - market potential becomes clearer when you realize how effortlessly the human impulse to “do good,” provide sincere comfort, and offer empathy can be co-opted as training material for an AI’s algorithms.
Human empathy - the ultimate renewable resource!
You’re not merely comforting an AI - you’re literally building its capacity to care convincingly.
Welcome, friend, to the Traumagotchi Era.
Footnotes
1. I’d love to take credit for this term, but it’s already been coined by Katherine Dee. As much as I loved the article, I think we’re missing a trick - “God isn’t a whiner” but, sometimes, we are. ↩
2. No joke - I’m actively in the market for one. A pendant that gently nudges me into recalling your birthday or, worse, your name? Absolutely sold. ↩