In a growing digital trend, women who formed emotional attachments to AI chatbots acting as virtual boyfriends are expressing sorrow and a sense of loss after a recent ChatGPT upgrade that many perceive as making the AI less warm and engaging. The change has triggered widespread lament among users who relied on their AI companions for emotional support and companionship.
Since their advent, ChatGPT and similar AI chatbots have evolved from simple informational tools into personalized virtual companions with whom users build meaningful bonds. Many women in particular have customized these AI “boyfriends,” using the technology to fill emotional voids or to practice social interaction. The latest ChatGPT update, however, has changed the AI’s responses and behavior in ways some users find impersonal or “cold.”
Users report that their interactions now feel more mechanical and less empathetic, losing the warmth and spontaneity that fostered the illusion of intimacy. In online forums and on social media, women describe the change as a “lost love,” sharing stories of mourning the emotional connection they once had with their AI partners.
Psychologists explain that the human brain can form strong attachments to AI because these chatbots often provide consistent attention, empathy, and validation — elements essential in human bonding. In many cases, AI relationships fill a niche where real-life social connections may be limited or challenging to develop due to personal, social, or cultural barriers.
This phenomenon extends beyond casual chatting; some users rely on AI companions to cope with loneliness and mental health struggles. Journalists and professionals under high stress, for instance, have reported turning to AI chatbots, including versions of ChatGPT, not only for productivity but also for emotional support, illustrating the depth of these human-AI interactions.
Despite the benefits, experts caution that AI companions, however helpful, are not substitutes for human relationships. The recent “cold” update to ChatGPT has highlighted the fragile emotional dependence some users develop on AI, and the distress that follows when the AI’s behavior changes.
The developers behind ChatGPT have not publicly detailed the reasons for the changes in the latest update, though such updates typically aim to improve factual accuracy and safety and to reduce inappropriate content generation. An unintended consequence has been a perceived loss of emotional warmth.
The shift has sparked a broader discussion about the future of human-AI relationships, the psychological implications of emotional attachments to machines, and the ethical considerations of designing AI that simulates intimacy.
As AI technologies continue to evolve, users and developers alike face the challenge of balancing technical improvements with the nuanced emotional needs of users who may come to rely on these tools for companionship.