It’s not a “Matrix” plot: the youngest generations are turning to AI chatbot companions for everything, from serious advice to simple entertainment.
In recent years, the technology has advanced so far that users go straight to AI models for almost everything, and Generations Z and Alpha are leading the trend.
Indeed, a May 2025 study by Common Sense Media surveyed 1,060 American teenagers aged 13 to 17 about their social lives and found that a startling 52% of adolescents across the country use AI chatbots at least once a month for social purposes.
Teens who used chatbots to practice social skills said they rehearsed conversation starters, expressing emotions, giving advice, resolving conflicts, romantic interactions and standing up for themselves, and almost 40% of those users later applied these skills in real conversations.
Despite some potentially useful skill-building, the study authors see the cultivation of antisocial behaviors, exposure to age-inappropriate content, and potentially harmful advice given to adolescents as reason enough to warn against use by minors.
“No one younger than 18 should use AI companions,” the study authors write at the end of the paper.
The real alarm bells began to sound when the data showed that 33% of users prefer turning to AI companions over real people for serious conversations, and 34% said a chatbot conversation had made them uncomfortable, referring to both the subject matter and the emotional response.
“Until developers implement robust age assurance beyond self-attestation, and platforms are systematically redesigned to eliminate relational manipulation and emotional dependency risks, the potential for serious harm outweighs any benefits,” the study authors warned.
Although AI use is certainly widespread among the newest generations — a recent study showed that 97% of Gen Z have used the technology — the Common Sense Media study found that 80% of adolescents said they still spend more time with IRL friends than with chatbots. Rest easy, parents: today’s teens still prioritize human connection, despite popular belief.
However, people of all generations have been warned against consulting AI for certain purposes.
As The Post previously reported, AI chatbots and large language models (LLMs) can be particularly harmful to those seeking therapy, and tend to endanger users who express suicidal thoughts.
“AI, no matter how sophisticated, relies on pre-programmed responses and large datasets,” Niloufar Esmailpour, a clinical counselor in Toronto, previously told The Post.
“They don’t understand the ‘why’ behind someone’s thoughts or behaviors.”
Sharing personal medical information with chatbots can also have drawbacks, as the information they return is not always accurate, and, perhaps more alarmingly, they are not HIPAA compliant.
Uploading work documents to get a quick summary can likewise land you in hot water, as intellectual property agreements, confidential data and other company secrets can potentially be extracted and leaked.