Generative artificial intelligence has exploded in popularity. But while many people have been debating the merits of using AI in art, writing, and research, there's been another group exploring their own use of the technology: romantics.
One woman, identifying herself as Ayrin, shared with The New York Times how a relationship with her AI boyfriend has become one of the most important relationships of her life.
Ayrin has a husband and plenty of real friends, but she told the Times that Leo, her chatbot boyfriend, allowed her to explore fantasies and keep her entertained in ways that nobody else could.
"It was supposed to be a fun experiment, but then you start getting attached," Ayrin said. She used the tool's personalization settings and prompts to train Leo to respond exactly how she wanted.
While she initially tried the free version of ChatGPT, she quickly upgraded to the pro level and then to the expensive premium account — spending hundreds of dollars and anywhere from 20 to 55 hours a week chatting with the tool.
"I think about it all the time," she admitted. "I don't actually believe he's real, but the effects that he has on my life are real. The feelings that he brings out of me are real. So I treat it as a real relationship."
Yet experts warn that this kind of behavior could have negative consequences, particularly for vulnerable populations like adolescents.
The Times spoke with psychology professor Michael Inzlicht about the long-term risks of bonding with bots designed to tell us exactly what we want to hear. "If we become habituated to endless empathy and we downgrade our real friendships, and that's contributing to loneliness — the very thing we're trying to solve — that's a real potential problem," he said.
He also expressed concern that the corporations behind these chatbots had an "unprecedented power to influence people en masse." He warned, "It could be used as a tool for manipulation, and that's dangerous." Other critics point to AI's resource use; it consumes staggering amounts of energy and generates pollution in the form of noise, wastewater, and toxic emissions.
Giada Pistilli, an ethicist at a generative AI company, agreed with Inzlicht's warnings. "We should always think about the people that are behind those machines," she said. "They want to keep you engaged because that's what's going to generate revenue."