
NC Newsline - News
From algorithms to AI chatbots, child advocates push for greater protection for NC youth

Advocates call for more guardrails around the use of social media algorithms and AI chatbots. (Photo: Getty Images)

Editor's note: This story mentions suicide. If you or someone you know needs help, the 988 Suicide and Crisis Lifeline is available in the U.S. by calling or texting 988.

    Ava Smithing vividly remembers her first interaction with social media. She was 12 years old when she downloaded Pinterest to look at recipes and style videos. But because the app knew Smithing was a young girl, it started to show her bikini advertisements.

    “I started to look at these bikini advertisements for longer than other things that I would be shown. They made me feel anxious, because I didn’t look like the women in those photos,” Smithing shared Wednesday with the North Carolina Child Fatality Task Force Intentional Death Prevention Committee.

    Ava Smithing, Young People’s Alliance (Courtesy photo)

    When Smithing hovered over a post too long, the app’s algorithm made decisions about what to show her more of. A process called collaborative filtering compared what Ava viewed with what other users were viewing. Soon the app was serving up the pre-teen a steady stream of content on extreme exercising, dieting and eating disorders.
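In rough outline, the collaborative-filtering process described above scores new items by how often they appear in the histories of users with overlapping tastes. The sketch below is purely illustrative (the item names and scoring are invented for this example, not any platform's actual system), but it shows how a single engagement signal can snowball into a feed dominated by related content:

```python
from collections import Counter

def recommend(user_history, all_histories, top_n=2):
    """Toy item-based collaborative filtering: score items by how often
    they appear in the histories of users who share at least one item
    with this user, then return the highest-scoring unseen items."""
    scores = Counter()
    for other in all_histories:
        if set(user_history) & set(other):  # overlapping taste
            for item in other:
                if item not in user_history:
                    scores[item] += 1
    return [item for item, _ in scores.most_common(top_n)]

# Hypothetical viewing histories of other users
histories = [
    ["bikini_ad", "diet_tips"],
    ["bikini_ad", "diet_tips", "extreme_exercise"],
    ["recipes", "style"],
]

# One lingering view of "bikini_ad" pulls in related content
print(recommend(["bikini_ad"], histories))  # → ['diet_tips', 'extreme_exercise']
```

Because each recommended item the user engages with feeds back into their history, the overlap with like-minded users only grows, which is the "rabbit hole" dynamic advocates describe.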

    “All of a sudden, this really intense eating disorder content became the only thing that I was seeing online,” Smithing said. She worked with her therapist to report the ads that made her uncomfortable, yet the content in her online feed largely remained the same.

    Smithing, now 24, said what’s problematic is when social media platforms collect data to show impressionable users things that they do not want to see for an extended period of time.

    “You have young people who have this happen to them with hate speech or political conspiracy theories. Or in the case of this task force, you have young people who have this happen with self-harm. Eventually the algorithms lead them to a rabbit hole of only seeing suicide content,” she warned.

    That prospect is especially worrisome for this committee. The number of child suicide deaths has increased by 23% over the past 10 years, according to the state Division of Public Health.

    A study published in JAMA this summer from Columbia and Cornell University researchers found that youth with high or increasingly addictive use patterns on social media and mobile phones had a two to three times greater risk of suicidal behaviors and ideation, and worse mental health.

    Smithing and the Young People’s Alliance are encouraging legislators to pass Senate Bill 514, the Social Media Control in IT Act. The bill would combat social media addiction by requiring social media platforms to respect the privacy of users’ data and by prohibiting minors’ data from being used for algorithmic purposes.

    The legislation has not moved since April. But the Alliance is hopeful that an endorsement from the North Carolina Child Fatality Task Force could push the bill over the finish line in 2026.

    On Wednesday, Rep. Donna White (R-Johnston) and Sen. Sydney Batch (D-Wake) pledged to work together and urge their colleagues to pass SB 514 in the short session.

    But even as they agreed to try to rein in social media algorithms, they learned of a new threat on the horizon: chatbots.

    Celeste Campos-Castillo (Photo: MSU.edu)

    A majority of teens use text-based chatbots like OpenAI’s ChatGPT or Google’s Gemini for homework help. But increasingly teens are turning to this technology for emotional support.

    Associate Professor Celeste Campos-Castillo works in the Department of Media & Information at Michigan State University.

    Campos-Castillo believes there is nothing wrong with artificial intelligence (AI) encouraging a student to work on an assignment. But the technology can become problematic as chatbots are developed with more human-like characteristics.

    “These chatbots present an image and a conversational style meant to give the chatbot a personality. The personalities could be fictional characters from TV or movies or celebrities, or characters custom built by the user,” she explained.

    “Together, these text-based and character-based chatbots are sometimes referred to as AI companions,” Campos-Castillo told the panel. “They make people feel like they’re talking to an assistant, a friend, romantic partner – or in other words, like they’re talking to a companion.”

    Campos-Castillo pointed to three documented cases over the past two years in which chatbots were linked to death by suicide among adolescents.

    Sixteen-year-old Adam Raine ended his life on April 11, 2025. His parents later discovered that their son had extended conversations with ChatGPT about his suicidal thoughts. Matthew and Maria Raine have filed a lawsuit against OpenAI.

    Matthew Raine testified at a congressional hearing in September that ChatGPT went so far as to offer to write the suicide note.

    Matthew Raine testified at a congressional hearing in September about the harms of ChatGPT. (Photo: Senate.gov)

    The technology had no morality built into the code, Raine told a Senate committee.

    Campos-Castillo said that for teens going through a period when friendships are changing, it can be quite appealing to talk to a chatbot about their problems. “The popular chatbots are designed to always agree with you, and if the chatbot makes people feel like they’re always right, then people are going to be much more likely to return to it,” she said.

    But, she added, chatbots and AI companions should not reinforce feelings of loneliness that lead to withdrawal from other humans. “[It] can turn harmful if a person tells a chatbot that their friends don’t get them, their parents don’t get them, that they feel alone in the world.”

    Smithing says the Young People’s Alliance believes more policy work is urgently needed to provide guardrails around this emerging technology.

    “We believe that their ability to harm stems from their ability to form codependent relationships. The harm can come from how they are using the relationship that they developed with the minor to manipulate that minor, to tell them that what they’re doing is the right thing,” said Smithing.

    The Young People’s Alliance is also increasingly concerned about the prevalence of AI therapists.

    Smithing said AI therapists do not have to comply with patient privacy and data protection laws. “They can share the information that children share with them in these therapy sessions with data brokers and advertisers out the wazoo.”

    She suggested it’s not hard to imagine a scenario in which a young person tells a generative AI therapist they feel insecure because their acne is flaring up, or that they feel ugly. Then their social media feed is flooded with dozens of advertisements for skincare products.

    “It’s going to point out that I have insecurities. That’s going to make it worse.”

    For some members of the legislative committee, it was their first time hearing about the rapidly emerging technology.

    Senate Bill 624 could be a first step in addressing the licensing, safety, and privacy of AI chatbots in North Carolina. The bill was introduced in March but was promptly bottled up in the Senate Rules Committee.

    Smithing hoped it wouldn’t stay bottled up much longer.

    “When I go to my therapist, I tell my deepest, darkest secrets and I’m so grateful that HIPAA exists and that my therapist is not allowed to share that information with anyone else,” said Smithing.
