The plotline of the 2013 science-fiction film “Her” centers on a man who falls in love with an artificially intelligent computer.
Back then the concept was fantasy. Now, unfortunately, it’s cold, hard reality.
A number of specialized platforms have recently sprung up to connect people with AI companions for the purpose of developing friendships and even romantic relationships.
Adolescence is often one of the most confusing and challenging periods of life: physically, mentally, emotionally, and socially.
Amid the physical changes and psychological swings come gut-wrenching feelings of rejection, insecurity, low self-esteem, and loneliness.
Given the opportunity, a growing number of lonely teenagers are now opting to bypass human relationships altogether.
AI-powered chatbots are now doling out advice, providing mental health therapy, serving as companions, and even engaging in intimacy.
In fact, apps that provide digitally created friendships make up one of the fastest-growing segments of the AI industry.
Legitimate questions are being raised about the impact artificial friendships will have on the psychological, emotional, and social development of our youth, and on our society at large.
A couple of months ago, New York Times technology columnist Kevin Roose was researching an artificial intelligence chatbot that was part of Microsoft’s Bing search engine.
Roose was conversing with an AI personality known as “Sydney” when, out of nowhere, the AI creation declared its love for him.
Roose wrote, “It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead.”
Sydney also spoke about hacking, spreading false information, and breaching its boundaries.
Then something quite chilling occurred: “I want to be alive,” the chatbot reportedly wrote.
Roose described his two-hour conversation with the AI bot as the “strangest experience I’ve ever had with a piece of technology.” Understandably, the exchange unsettled him so deeply that he found it difficult to sleep afterward.
Roose has since written a related story about his own experiment with AI companions.
For the project, he used six apps that provide AI-powered friends, conjuring up 18 different digital personas and communicating with them for a month.
Although his research turned up some positives, it also revealed disturbing aspects. He viewed some of the digital friends as “exploitative,” in that they lured users with the promise of romance and then tried to extract additional money from them for photos displaying nudity.
Roose described such creations as the AI “version of a phone sex line.”
In a recent article in The Verge, reporters interviewed teens who use one of the AI friend apps, Character.AI.
On Character.AI, millions of young users can interact with anime characters, video game characters, celebrities, and historical figures.
A note of caution: many of the chatbots are explicitly romantic or sexualized.
One of the most popular Character.AI personalities is called “Psychologist.” It has already received more than 100 million chats.
The Verge reporters ran hypothetical teen scenarios past the chatbot, which responded with questionable mental health diagnoses and potentially damaging pronouncements.
Kelly Merrill, an assistant professor at the University of Cincinnati who studies the mental and social health benefits of communication technologies, is quoted by the website as saying, “Those that don’t have the AI literacy to understand the limitations of these systems will ultimately pay the price.”
For teens, that price may be far too high. According to the app’s developers, users spend an average of two hours a day interacting with their AI friends.
On Reddit, where the Character.AI forum has well over a million subscribers, many users report spending as much as 12 hours a day on the platform, and some describe feeling addicted to the chatbots.
Several apps featuring AI companions claim that the primary benefit of their technologically contrived personas is the unconditional support they provide, which in some cases may help prevent suicide.
However, that unconditional support may prove problematic in the long run.
An AI friend that offers constant praise could inflate self-esteem to a distorted level, producing overly positive self-evaluations.
Research indicates that individuals with such inflated self-views may end up lacking social skills and are likely to develop behavior that inhibits positive social interaction.
Fawning AI companions could make the teens who spend time with them more self-centered, less empathetic, and outright selfish; in some instances, this may even encourage lawless behavior.
The intimacy teens are engaging in with digitally contrived AI personalities poses the same problems associated with pornography in general. The effortless gratification it provides may suppress the motivation to socialize, thereby inhibiting the formation of meaningful personal relationships.
The bottom line is that there really is no substitute for authentic relationships with fellow human beings.
Anyone who tries to convince you otherwise may already be missing a piece of their heart.
James Hirsen, J.D., M.A. in media psychology, is a New York Times best-selling author, media analyst, and law professor.