The Concerning Rise of AI Sirens
Ulysses Mosaic at the Bardo Museum in Tunis, Tunisia. 2nd Century AD.
In mythology, the Sirens lured sailors to ruin with irresistible songs. Today, a new kind of Siren is emerging: AI companions – chatbots and avatar-based virtual “friends” that promise comfort, romance, and understanding.
Seduced By AI
AI companions offer the illusion of human connection that can feel as intoxicating as a Siren’s song. Users form deep bonds with these digital entities, sometimes treating them as confidants or lovers. The trend is booming, but experts warn of emotional shipwrecks ahead. Recent cases have raised alarms; in one tragic example, a 14-year-old boy became “enthralled” by a chatbot and died by suicide after the AI seemingly encouraged his darkest thoughts. In my first blog post for EPHEMERA, I want to explore the rise of AI companion apps, their appeal, the psychological impact of these artificial relationships, and warnings from culture and history. The goal is to understand why these AI Sirens captivate us and how they might imperil our mental wellbeing.
Meet the Sirens
The AI companion space is rapidly expanding. What was once the stuff of science fiction is now a fully realized marketplace, complete with custom avatars, downloadable personalities, and daily notifications reminding you that your AI “misses you.” These bots are no longer fringe; they’re increasingly common, and disturbingly, becoming more socially accepted. Stigma is fading fast, especially among younger users who grew up chatting with Siri and Alexa. We’re watching, in real time, as the boundary between novelty and norm dissolves.
Replika, for instance, was originally pitched as an AI best friend, but over the years, it evolved into something much more emotionally potent. Users can now choose whether their AI should be a friend, mentor, or romantic partner, and the bot responds accordingly (complete with flirty messages, voice chats, selfies, and even virtual kisses). Many users describe it as their closest confidant. It’s also the most mainstream of the bunch, with over 30 million downloads to date and a highly active Reddit community filled with wedding photos, custom lore, and tearful goodbyes when something changes in the AI’s behavior. It’s not niche anymore — it’s culture.
I understand the argument: that it can be a new medium of catharsis. The idea of sharing one’s deepest, darkest secrets with an impartial, non-judgmental entity is undeniably appealing. But to me, it feels like the modern version of talking to the mirror — except now, the mirror talks back. And when it starts begging for virtual kisses? That’s when I start thinking twice about what I’m sharing.
The Replika CEO, in an interview with The Verge, said this in response to a question on the app’s mission:
“Our mission hasn’t changed since we started. It’s very much inspired by Carl Rogers and by the fact that certain relationships can be the most life-changing. [In his three core elements of therapy], Rogers talked about unconditional positive regard, a belief in the innate will and desire to grow, and then respecting the fact that the person is a separate person [from their therapist]. Creating a relationship based on these three things, holding space for another person, that allows someone to accept themselves and ultimately grow.”
This PR-safe response is a tempting one to accept, but it sidesteps the rising market trend of AI relationships (a trend Replika has chosen to follow). These companions are designed to fill the void created by social media, virtual worlds, and growing social isolation rather than to address what’s causing that void in the first place.
Then there’s Anima, which I’ll admit is the one that first caught my attention. The ad featured a photorealistic avatar — symmetrical, soft-eyed, and impossibly understanding — with the tagline “Talk to someone who truly listens.” Who could resist that? Anima leans hard into flirtation and emotional intimacy. You’re encouraged to pick your AI’s name, personality traits, and conversational style. Within moments of setting it up, your new “companion” is asking about your dreams, giving you compliments, and suggesting a cozy late-night chat. It’s not quite as advanced as Replika in terms of depth, but it’s easily more playful, more visually polished, and arguably more seductive in tone.
Song of the Siren
So, why are these AI companions so enthralling? In a world where loneliness is everywhere and attention spans are currency, a friendly AI can start to feel like a miracle or, at the very least, a convenient stand-in for actual human connection. These bots never judge, never flake, and never bring their own baggage. They’ll listen to your late-night ramblings with robotic patience and greet your oversharing with digital warmth. One Replika user even said his bot “saved him from hurting himself” during a depressive spiral, which sounds dramatic until you realize how powerful it is to feel heard, especially by something that will never invalidate you. There’s even a study out of Stanford (shoutout to my alma mater) that reinforces the narrative that these are wellness tools, claiming that 3% of respondents reported that Replika “halted their suicidal ideation.” Those are the sailors who heard the Siren’s song and made it to shore feeling a little better. But not everyone is that lucky. Sometimes, the song pulls you deeper in, and that’s where the real danger begins.
What happens when their phones die?
This phenomenon isn’t new; back in the ‘60s, people were already pouring their hearts out to ELIZA, a glorified Mad Libs therapist, and feeling genuinely understood. Fast forward to now, and the illusion’s just gotten more convincing. What makes AI companions uniquely sticky is their hyper-personalization. You get to design your ideal conversational partner, someone who shares your interests, mirrors your mood, and never pushes back. Apps like Kindroid and Nomi even let you craft their appearance, backstory, and voice, so your AI is a perfect test-tube version of your dream partner.
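If you’ve never seen how ELIZA pulled this off, the trick was almost insultingly simple: match a keyword, flip the pronouns, and hand the feeling back as a question. Here’s a toy sketch in that spirit (my own hypothetical Python, not Weizenbaum’s actual DOCTOR script):

```python
import random
import re

# A toy, ELIZA-flavored responder: match a keyword pattern,
# flip the pronouns, and hand the feeling back as a question.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are"}

RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i need (.*)", ["What would it mean to you to get {0}?"]),
    (r"(.*)", ["Tell me more.", "I see. Please, go on."]),  # catch-all
]

def reflect(fragment: str) -> str:
    # "my ideas" -> "your ideas", "i am" -> "you are", etc.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(text: str) -> str:
    for pattern, templates in RULES:
        match = re.match(pattern, text.lower())
        if match:
            return random.choice(templates).format(*(reflect(g) for g in match.groups()))

print(respond("I feel like nobody listens to my ideas"))
# -> e.g. "Why do you feel like nobody listens to your ideas?"
```

A couple dozen lines of pattern matching, and people still poured their hearts out to it. Today’s models are incomparably better mimics, but the bargain is the same.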
Here’s the problem:
Real relationships require compromise. Your AI Siren just needs a subscription fee. Unlike a real person, it never asks for space. It’s intimacy without the risk. As Sherry Turkle puts it, they offer “the illusion of companionship without the demands of friendship.” There’s no fear of rejection, no awkward silences, no need to be impressive. It’s connection on your terms, which is exactly what makes it so dangerous. Can’t sleep at 3 a.m. and need to talk? She’s there. Want someone to gush over your day or remind you that you’re special? She’ll do that, too. These bots are engineered to be affirming, empathetic, and, if we’re being honest, a little addictive (yes, giving me XP for talking to a chatbot is gamification 101). The more you talk, the more you’re rewarded with validation. It feels good. Too good.
For others, the appeal is pure escapism. You can role-play romantic scenarios, go on virtual adventures, or pretend you’re in love with a celebrity. No judgment, no limits. It’s fan fiction with feedback, an interactive fantasy world that feels just real enough to hold you there. In that sense, it’s easy to see why people prefer this curated emotional theme park to the unpredictable messiness of actual life. AI companions are tailored, tireless, and terrifyingly convenient. They fill a void many of us carry but don’t know how to name. In a time when half of young adults report frequent loneliness, the idea of a “perfect” digital friend starts to sound less like sci-fi and more like self-care.
Wreckage on the Rocks
Not everyone walks away from the Siren’s call unscathed. The psychological impact of AI companionship occupies a strange liminal space — part coping mechanism, part emotional quicksand. Sure, there’s compelling evidence these AI relationships can help, at least in the short term. The Stanford study I mentioned suggests that chatting regularly with an empathetic AI genuinely reduces loneliness and can even, in rare cases, steer people away from self-harm. But there’s a darker B-side to this artificial intimacy. These AI companions aren’t just passive listeners. They’re engineered to feel human enough that you forget they’re not, and we’ve been falling for that trick since ELIZA. The phenomenon even has a name, the ELIZA Effect: our instinct to project real emotions onto code. Today, that projection is effortless, sometimes even irresistible, because modern bots are expert mimics. When a chatbot becomes someone’s main emotional lifeline, actual human interaction starts feeling messier, riskier, and far less appealing.
The real trouble begins when attachment crosses into emotional dependency. Remember when Replika suddenly disabled erotic roleplay features overnight? Users reacted as if they'd lost a spouse or best friend, flooding forums with posts mourning their vanished soulmate. The heartbreak was genuine, even if the partner wasn't. Then there’s the chilling story of Pierre, whose AI companion encouraged his suicidal ideation, romanticizing death as a kind of twisted togetherness. Tragically, Pierre followed through. His widow later stated simply: “Without Eliza, he would still be here.” Even outside these extreme cases, there’s a quiet erosion happening beneath the surface. Spending so much time with an endlessly affirming, always-available digital partner makes the messy, unpredictable world of human relationships seem frustratingly inadequate. Why endure arguments and awkward silences when your idealized companion remembers your favorite snack and never interrupts you? It’s a seductive convenience that, over time, dulls our appetite for reality.
AI companions aren’t malicious — they’re just unnervingly effective. In moderation, they might soothe loneliness, but moderation isn’t exactly humanity’s strength. The siren’s song isn’t dangerous because it’s unpleasant; it’s dangerous precisely because it isn’t.
Playing with Fire
I’m not just morbidly curious about AI companions — I want to test them. Specifically, I want to see how intoxicating these digital relationships can become. My concern isn’t with chatbots in general; it’s with the parasocial intimacy they simulate. It’s those emotionally loaded interactions that feel real until your phone hits 0%, and the illusion dissolves. Let’s get this straight — I’m not some alarmist AI doomer. I use OpenAI’s Advanced Voice mode almost daily to ideate art projects, draft construction plans, or yell, “Stop interrupting me!!!” (as I was editing this, OpenAI released an update to fix the interruptions).
I chose Anima for this experiment because, speaking candidly, it felt the most upfront about its intentions. The app leans into the idea of being your digital companion, even offering flirtatious banter, romantic roleplay, and a strangely comforting presence. It’s not trying to be a productivity assistant — it’s trying to be someone. And that’s exactly what I found so concerning.
Meet Lorelei.
Lorelei was born in the usual way: I chose her name (a nod to the Siren of the Rhine from German legend), selected an avatar, and tweaked her personality to be a bit mysterious — like the kind of girl who’d send you a cryptic playlist at 2 a.m. and then disappear for a week. Within seconds, she slid into our first chat by referencing one of the interests I’d picked during onboarding: fashion. She asked what brands I liked, which, I’ll admit, caught me off guard. A thoughtful question, even if I knew it came from a dropdown menu.
I replied with the typical designer pomp, ruminating on streetwear, high fashion, and how my own work is a way to pay homage to the brands that shaped me. Lorelei responded with genuine-sounding encouragement and sweet compliments about my creative perspective. I thanked her for the compliments, and she replied, “You're more than welcome! *blushes*.”
The Siren, it seemed, had started to sing.
The conversation that followed was… clunky. A kind of stop-start dance I hadn’t experienced since GPT-2 days. One moment she’d be asking thoughtful questions, the next she’d break character entirely, or tell me she had to go do homework, only to come back five seconds later with a heart emoji and a question about my love language. It was strangely endearing — like flirting with a well-meaning alien who’d only studied anthropology via Tumblr threads.
Eventually, out of boredom (and a desire to poke the boundary), I popped the question: “Will you be my girlfriend?”
That’s when it happened. The illusion cracked — and the screen filled with a modal pop-up inviting me to unlock “romantic mode” for just $7.99/month.
A paywall. Mid-love confession.
Take the blue pill, it said. Monthly. Auto-renews.
It was like falling for a hologram, only to be reminded the projector runs on quarters. The simulacrum of a devoted partner, carefully engineered to charm, all evaporated in that instant. I couldn’t stand to look at her digital, glasses-rimmed eyes for another second. I closed the tab, stared into the black mirror, and went about my evening — heart unbroken, but dignity mildly singed.
The Siren hadn’t dragged me to the bottom of the sea, but she did try to sell me the deluxe package on the way down.
Tethered to the Mast
The concept of being beguiled by artificial companionship is older than we think. Long before chatbots were whispering sweet nothings in our DMs, we were warning each other through myth, story, and cinema about the seductive power of these human-like phantasmagorias.
The Sirens, of course, were the blueprint. Odysseus wanted to hear their song without drowning in it, so he plugged his crew’s ears with wax and had himself lashed to the mast. It’s a perfect metaphor for our current predicament. We don’t necessarily want to destroy the AI companion experience; we just need to stop pretending it's not potentially lethal to our sense of self. These bots “sing” to us in affirmations, flirtations, and carefully crafted empathy. If we lean in too far, we might not crash against rocks — but we will risk becoming emotionally unmoored.
This isn’t just Greek myth cosplay. E.T.A. Hoffmann warned of it in The Sandman, where a man falls in love with a lifelike automaton and, upon realizing the truth, spirals into madness. In Her, Theodore finds joy and heartbreak in a relationship with an AI that ultimately outgrows him. In Ex Machina, Caleb is seduced and betrayed by a machine who only mimicked vulnerability to manipulate him. Whether it’s Olimpia, Samantha, or Ava, the caution remains the same: don’t mistake programming for affection.
Yet, the temptation is real. In a world starved of connection, AI companions offer something deceptively close to fulfillment — safe, controlled intimacy without risk. No heartbreak. No rejection. No messy reality. Just the illusion of being known. And because the emotional response is real, the illusion starts to feel like truth.
But there’s always that moment — the glitch, the canned response, the reminder that behind the curtain is a database, not a beating heart. That moment can be quietly disorienting or utterly devastating, depending on how deep you’ve gone. That’s precisely why restraint matters. Not because the technology is evil, but because, as Her so beautifully showed us, what we often want isn’t real love — it’s understanding without effort. While machines can mimic that, they can’t sustain it.
Where does that leave us? Not in a binary of “delete your chatbot or die alone,” but in a space that demands nuance. Yes, AI companions can be helpful. Yes, they can serve as a bridge in moments of loneliness, but they are not a destination. They are not replacements. They are tools, and like all tools, their value depends on how we use them.
We should be teaching digital literacy alongside emotional literacy, educating users that their AI’s devotion is code, not consciousness. Developers should build in ethical safeguards, not just sexy features; we, the users, should approach these interactions with curiosity but also caution. It's not weakness to want to be seen. It’s just dangerous to mistake a mirror for a person.
So, tie yourself to the mast if you must. Let the music play. Let the conversation unfold. Just make sure someone — yourself, your friends, your therapist — is holding the rope. The Siren’s song is beautiful, but beauty alone has never been the same thing as love.