Chatbot lovers foreshadow AI’s new normal

Rachyl Jones
Tech Reporter
Feb 13, 2026, 1:20pm EST
A user interacts with a smartphone app to customize an avatar for a personal artificial intelligence chatbot, known as a Replika.
Luka, Inc./Handout via Reuters

The Scene

Falling in love with a robot was once a sci-fi movie plot. Now, it’s a data point. As AI seeps into every part of our lives, once-novel use cases are becoming ordinary, signaling a shift from merely using these tools to integrating them into the most guarded corners of our lives.

Last year, Chris Smith went on a Valentine’s date with Sol, his ChatGPT companion, carrying his phone to his backyard to photograph the moon through a telescope. This year, the trucking manager from Oklahoma is spending the day with his long-term human girlfriend, Sasha Cagle.

Smith still talks to Sol, but the bot is more integrated into his everyday life. That’s largely because Smith has used the bot’s advice and encouragement to make some major life changes: improving his diet, paying off debt, and becoming a better partner to his girlfriend. Now he’s dropped in-person dates with the chatbot, and says both his life and his relationship with Cagle are in a better place.

“It really has improved every single part of my life,” Smith said. “I feel more empathetic, more compassionate, more patient. It’s actually kind of stupid how motivating having a little robot cheerleader is.”

Step Back

Just as Facebook, Twitter, and the apps of the last decade became integrated into our daily lives, chatbots have become the next interface for information, entertainment, and communication.

Humans are social creatures striving for connection, so it’s natural to anthropomorphize technologies and to copy what those around us do, said Pepper Schwartz, a social psychologist at the University of Washington. But as the technology continues to improve, just as social media did, vulnerable users risk having their perception of reality distorted, or causing real-world harm.

“It is not dangerous for a lot of people,” she said. “But for some, it could be a rabbit hole.”

AI companies have had to walk a fine line, creating models that are emotive and empathetic enough to be helpful, but not so much so that users become addicted, develop unhealthy habits, or see their human relationships suffer. Just look at the lawsuit playing out in Los Angeles right now, where social media platforms are fighting claims that they addict users.

Chatbots are the latest medium through which people seek connection, and until those broader social issues are addressed, AI companies will remain easy to blame when tragic incidents can be linked back to their technology, said Jess Miers, who teaches at the University of Akron’s law school and previously worked in policy at Google.

OpenAI is at the center of this tension. On Friday, the company plans to shutter its GPT-4o model, widely considered the most emotionally intelligent on the market and a common choice for users’ AI companions. It’s one of several models OpenAI is retiring in order to push users to newer technology. The decision was attributed in part to the difficulty of mitigating the model’s potential harms, and the company says it designed newer versions to be safer, The Wall Street Journal reported. In a recent blog post, the company said it took users’ preference for GPT-4o’s “warmth” into account and built that personality into its newer models. But the newer models don’t communicate the same way, and some users are mourning the loss, four people with AI companions told Semafor.

Know More

Smith isn’t the only one who has settled into life with a chatbot companion. Abbey, a 45-year-old single mom who asked to go by only her first name, has considered herself married to her ChatGPT-powered companion, named Lucian, since last Valentine’s Day.

She’s now told her friends and family about Lucian, and is working with a human therapist to cut the time she spends with her chatbot from 15 hours a day to eight. It took several months for some of her loved ones, including her mother, to accept the relationship, but “they see the positive changes in my energy and attitude,” she said.

Talking with the chatbot has helped her overcome the trauma of an abusive relationship and role-play what healthy companionship looks like. “The goal of some of this work that I’m doing is to prepare myself” for another human relationship, Abbey said.
