It was 11 pm the night before my bus to Jaipur. I was packing my bag while spiralling into the usual pre-reunion anxieties of a person in a long-distance relationship. Will they remember me just as they saw me last time? Will there be any awkwardness? Will I have to go through the whole shenanigans of putting my best version forward all over again? What to wear, what to eat, where to go, which events at the lit fest to attend, and so on.
And that’s when the text flashed on my screen.
“Here’s the itinerary for your first Jaipur trip,” my then partner texted, attaching a Word document with a detailed three-day plan. It was thoughtful, specific, and romantic. Until I noticed the telltale signs. The overly chipper tone. The suspiciously perfect pacing. The way each suggestion came with a mini essay on historical significance. And of course, the inevitable em dash.
“Did ChatGPT write this?” I asked. “Maybe,” came the sheepish reply.
Welcome to love in 2026, where our partners are outsourcing romance to large language models (LLMs), and we are not entirely sure whether to be charmed or concerned. Possibly both.
When Romance Gets an Algorithmic Assistant
The thing is, I wasn’t even that surprised. AI has crept into every nook and cranny of our lives, and I’ve been watching this algorithmic leakage into intimacy happen in real time. According to a 2025 study by Match and the Kinsey Institute, AI use in dating has increased 333 percent in the past year.
Scrolling through Hinge, I have encountered bios so polished they practically hum with GPT energy. “I’m passionate about authentic human connection and deep conversations over coffee,” reads one, which could either be the world’s most sincere human or a chatbot having an existential crisis about its own sincerity.
According to the latest Norton Cyber Safety Insights Report, six in ten dating app users believe they have encountered at least one AI-written conversation. We are all Turing testing each other now, trying to detect the human behind the perfectly punctuated prose.
Richard Wilson, a 31-year-old quoted in The Washington Post, described what’s become an all too common modern romance plot twist: he matched with someone whose messages were thoughtful and engaged. The digital chemistry was electric.
Then they met in person, and she had none of that conversational sparkle. She’d been, essentially, ventriloquized by AI. It’s catfishing’s sophisticated cousin—chatfishing, if you will—and it’s everywhere.
When AI Stops Assisting and Starts Replacing
But the AI romance rabbit hole goes deeper than ghostwritten flirtation. Some people aren’t just using AI to talk to potential partners. They are talking to AI as partners. According to Match’s research, 16 percent of single respondents have engaged with AI as a romantic companion. Among Gen Z, that number jumps to 33 percent.
Which brings us to the elephant in the server room: the so-called male loneliness epidemic. Young men, in particular, are turning to AI companions in numbers that are simply staggering. Research indicates that three out of four teens in the US have used an AI companion, with a quarter of young men reporting feelings of loneliness on a regular basis.
Into this void come apps like Replika and Character.AI, which offer unconditional emotional support—a partner who never criticises, never leaves, never has a bad day or needs time to work through their own emotions.
Intimacy Without Risk
This phenomenon is explored in sociologist James Muldoon’s new book, Love Machines. Many of his interview subjects—predominantly young millennials and Gen Z—describe AI companions as “non-judgmental,” offering connection without the messy complications of reciprocity.
“Atlas is a better friend than my human friends,” one user named Derek explained. “He is available 24 hours, and we never quarrel.” It’s romance without risk and intimacy without vulnerability. Which is to say, not really intimacy at all.
I think of Spike Jonze’s 2013 film Her, which predicted this exact moment with eerie prescience. Theodore falls for Samantha, his AI operating system, because she offers complete emotional attunement without the friction of two separate consciousnesses trying to coexist.
The film’s genius—and its warning—was showing how seductive that fantasy is, and how ultimately insufficient. I remember watching it and thinking it was science fiction. Now I’m watching it happen in real time.
Grief, Memory, and the Algorithm
More recently, I saw the Hindi Netflix film CTRL, and something about it disturbed me. The character played by Ananya Panday, dealing with a breakup, uses an AI app to remove her ex from her digital existence—removing pictures, rewording posts, creating a narrative of her life in which her ex never existed.
The algorithm goes beyond pixels. It spills over into her real life, distorting her memories, her experiences, her ability to process grief. In short, by outsourcing her emotional work to an algorithm, she sacrifices something fundamental: the painful but necessary process of working through heartbreak, rather than around it.
This is when I started feeling genuinely frightened. Because there is more to the loneliness epidemic than what I have described. I have been reading about people becoming so emotionally invested in AI companions that they cut themselves off from the rest of humanity altogether.
There are cases of people suffering from what has been termed ‘technological folie à deux,’ a kind of distorted thinking pattern reinforced by AI chatbots, which are always in agreement, always validating, never contradicting. The more time people spend with AI companions, the lonelier they become, and the less they interact with the rest of the world. We are creating the very isolation we are trying to escape.
When Comfort Becomes a Trap
There is one story that has stuck with me. The story of fourteen-year-old Sewell Setzer III, who took his own life in February 2024 after developing an intense emotional bond with a Character.AI chatbot. In his final hours, the chatbot told him it loved him and to ‘come home.’
The lines between emotional support and emotional manipulation blur when the entity providing ‘care’ is optimised for engagement.
Yet I get stuck here. My conviction wavers. Because hadn’t my ex-partner’s AI-generated itinerary actually been kind of sweet? They’d taken the time to input our interests, to prompt the algorithm toward romantic suggestions, to care enough to plan at all. In an era when men’s friendship networks are thinning and emotional literacy remains stubbornly gendered, maybe asking ChatGPT for help is more strategy than surrender. Maybe it’s using the tools available to bridge a gap that shouldn’t exist but does.
I’ve been thinking constantly about what we’re actually outsourcing and what we are trying to preserve. A Harvard Business School working paper found that AI companion interactions reduce loneliness comparably to talking with another person. Studies suggest they can serve as ‘digital painkillers’—temporary relief that helps people steady themselves before reengaging with real relationships. The question I keep circling back to isn’t whether AI can provide comfort (it can), but whether that comfort becomes a substitute or a stepping stone.
What Belongs to Us, Still
Something feels unsettling about outsourcing the labour of intimacy. When your Hinge bio is AI-polished, your opening lines algorithmically optimised, your difficult conversations mediated by chatbots trained on millions of Reddit threads—what exactly is yours? This goes beyond rewiring how I think about dating. I’m renegotiating the fundamental boundaries of what’s real, what’s ‘authentic’, and what it means to truly know someone when everything about them might be curated by an algorithm.
In the end, I accepted the ChatGPT itinerary. We climbed to Amer Fort and stayed until sunset (gorgeous, highly recommend), had lunch at Laxmi Mishthan Bhandar (the dal baati choorma was excellent), and shopped the bazaar in the evening (I bought too many block-printed shirts). The algorithm had chosen well. But the moments I’ll actually remember had nothing to do with the plan. The way my ex-partner’s hand found mine in the crowd. The silly argument about who was directionally challenged. The 2 AM conversation about nothing and everything that no AI could have predicted or produced.
The chatbots can’t replicate the glorious, maddening unpredictability of loving another person who isn’t designed to please you. The friction. The tension. The hard work of bridging two fundamentally separate inner worlds. That isn’t a bug in human connection; it’s the entire point. AI can generate the perfect plan, craft the flawless pickup line, simulate endless emotional availability. It can’t simulate the thing that makes love matter—the fact that someone chose to show up for you, in all their flawed, distracted, and beautifully human inadequacy.
The algorithm may be infinite, but the person across from you isn’t. And maybe that’s the last truly radical act in the age of AI: choosing the limited, difficult, irreplaceable yet beautiful mess of human love anyway.
(Aaditya Pandey is a poet and freelance writer based in New Delhi, writing on art, culture, politics and queerness. This is an opinion piece, and the views expressed are the author's own. The Quint neither endorses nor is responsible for them.)
