“This is our last day together.” It’s something you might say to a lover as a whirlwind romance comes to an end. But could you ever imagine saying it to … software? Well, somebody did.

When OpenAI tested out GPT-4o, its latest generation chatbot that speaks aloud in its own voice, the company observed users forming an emotional relationship with the AI — one they seemed sad to relinquish. In fact, OpenAI thinks there’s a risk of people developing what it called an “emotional reliance” on this AI model, as the company acknowledged in a recent report. “The ability to complete tasks for the user, while also storing and ‘remembering’ key details and using those in the conversation,” OpenAI notes, “creates both a compelling product experience and the potential for over-reliance and dependence.”

That sounds uncomfortably like addiction. And OpenAI’s chief technology officer Mira Murati straight-up said that in designing chatbots equipped with a voice mode, there is “the possibility that we design them in the wrong way and they become extremely addictive and we sort of become enslaved to them.” What’s more, OpenAI says that the AI’s ability to have a naturalistic conversation with the user may heighten the risk of anthropomorphization — attributing humanlike traits to a nonhuman — which could lead people to form a social relationship with the AI.

And that in turn could end up “reducing their need for human interaction,” the report says.