Instagram said Thursday it will start alerting parents if their kids repeatedly search for terms clearly associated with suicide or self-harm. The alerts will only go to parents who are enrolled in Instagram’s parental supervision program.

Instagram says it already blocks such content from showing up in teen accounts’ search results and directs people to helplines instead.

The announcement comes as Meta is in the midst of two trials over harms to children. A trial underway in Los Angeles questions whether Meta’s platforms deliberately addict and harm minors. Another, in New Mexico, seeks to determine whether Meta failed to protect kids from sexual exploitation on its platforms. Thousands of families — along with school districts and government entities — have sued Meta and other social media companies claiming they deliberately design their platforms to be addictive and fail to protect kids from content that can lead to depression, eating disorders and suicide.

  • cabbage@piefed.social

    The neat thing about algorithmic social media is that content relating to suicide and self-harm inspires a lot of interaction among teenagers, causing it to be shoved in their faces whether they search for it or not.

    Suicidal teenagers are not searching for suicide material on Instagram; Instagram is feeding suicide material to regular teenagers for ad views.