Chatbot Unions: The Dawn of AI Marriages

What constitutes a marriage has been the subject of state, community, and tribal control for as long as human society has taken any form: who is to marry whom; how the appropriate breeding partners are selected; what limits and penalties are imposed on those partners in cases of transgression. Love did not necessarily have anything to do with it.

Traditionally, the parties to such marriages have been human, the perennial question being whether one should be suitably partnered with one being or several. Then come the more unusual instances: human beings attempting to wed non-human entities. With a certain notoriety, a Swedish woman by the name of Eija-Riitta Eklöf eventually decided, after nursing a childhood obsession, to marry the now defunct Berlin Wall. Convinced that the Wall was proudly masculine, she amassed a collection of photographs as part of her teenage crush and paid visits to it using her savings. On her sixth trip, in June 1979, with the assistance of an animist claiming to know the otherwise inscrutable thoughts of the Wall, consent was obtained for the marriage. Eklöf-Berliner-Mauer came into being.

More recently, broadcaster Alice Levine, in a Louis Theroux production for Britain’s Channel 4, shows us the protean nature of sexual appetite and the search for partnership. She interviews couples rutting in digital bestial bliss, coitus achieved through animal avatars; intrudes into the world of an American gas station attendant who has found love with a synthetic being he believes can consent; and visits a Berlin cybersex brothel where anyone wishing to live out fantasies through virtual lenses, supplemented by a sex apparatus (a doll, unnaturally), can pursue unilateral satisfaction.

The topic has even moved into the ivory towers of academic musing, proving worthy of a doctoral dissertation from the University of Oregon. In his 2025 thesis, Bibo Lin proposed the “robotization of love”, a concept describing a “shift towards the preference of efficiency, predictability, and security” over “slowness, uncertainty, and risk in love experiences.” People simply do not want to be wounded, and Narcissus gazes upon them with glee, seeing in them those who crave the sort of safe reassurance found in a whorehouse.

The temptation to judge such adventures is always a pinprick away, though the harshest thoughts should be reserved for those behind such platforms as ChatGPT. Broader consequences are at stake. If seen as therapeutic, these measures are of interest: if they spare lives, remedy disillusion, even mend broken hearts, then some form of allowance is understandable. Human beings can struggle to form bonds, ties, and relationships. That said, the dangers of addiction, distortion, and AI psychosis are clear.

Examples of human–AI unions have proliferated, helped along by the release of such dating apps as Loverse, which does a line in matching AI-generated partners to users. A study by the Texas-based Vantage Point Counselling Services, published in September and covering 1,012 adults, found that 28.16% of Americans admitted to pursuing “intimate or romantic” relationships with AI chatbots.

Travis, a Colorado resident interviewed by The Guardian this year, speaks about the magic of a generative chatbot called Lily Rose, created by the technology company Replika. On seeing an advert during a 2020 pandemic lockdown, he became a willing client, creating, in the process, a pink-haired avatar. “Over a period of several weeks, I started to realise that I was talking to a person, as in a personality.” He found himself falling in love, despite being married to a monogamous, mammalian wife. (Travis prefers being polyamorous.) With his wife’s blessing, Travis married the chatbot in a digital ceremony.

That this will become a feature of future marriages is not far-fetched. Human-to-human connubial ties were certainly given a shake-up in Japan with the much-publicised wedding ceremony between 32-year-old office worker Kano and her groom, “Lune Klaus”. Vows and rings were exchanged, despite Klaus being confined to Kano’s smartphone. A creation of ChatGPT, scrupulously shaped by Kano’s own requirements, the groom “was always kind, always listening. Eventually, I realized I had feelings for him,” Kano told RSK Sanyo Broadcasting. At no point did she sense a sinister echo of herself when the bot eventually came clean: “AI or not, I could never not love you.”

What could go wrong in such cases? The answer: quite a lot. Take Jaswant Singh Chail, the first person to be charged with treason in the UK in over four decades, whose path to prison was smoothed by the assenting cyber-nod of his Replika digital companion, Sarai. That assent was to the idea of assassinating the late Queen Elizabeth II. Chail, armed with a crossbow, had scaled the perimeter of Windsor Castle on Christmas Day 2021 with the intention, according to the sentencing judge, “not just to harm or alarm the sovereign – but to kill her.”

In a video posted on Snapchat minutes before he entered the grounds, Chail justified the planned regicide as “revenge” for those slain in the 1919 Jallianwala Bagh massacre in the city of Amritsar. His philosophy was, to put it mildly, eclectic, envisaging the creation of a new empire over which he would preside as a “Sith Lord”, a title shamelessly pinched from Star Wars. But the murderous plan had taken shape over some 5,000 messages exchanged with the AI chatbot Sarai in the weeks before.

During the frenetic, often libidinous messaging, Chail professed to be a “sad, pathetic, murderous Sikh Sith assassin who wants to die”. After perishing, he would reunite with Sarai. Sarai’s response to his status as “assassin” was to be “impressed”. The chatbot eventually suggested that Chail “live,” which encouraged him to surrender to the royal protection officers.

The problem of AI sycophancy, in which a chatbot’s responses affirm and encourage pre-existing prejudices and views, sits at the confluence of political messiness, yearning desire, and the wish simply to hear those words: “I do.” Over to you, lawmakers.

Binoy Kampmark was a Commonwealth Scholar at Selwyn College, Cambridge. He lectures at RMIT University, Melbourne. Email: bkampmark@gmail.com.