Instagram said Thursday it will start alerting parents if their kids repeatedly search for terms clearly associated with suicide or self-harm. The alerts will go only to parents enrolled in Instagram’s parental supervision program.

Instagram says it already blocks such content from showing up in teen accounts’ search results and directs people to helplines instead.

The announcement comes as Meta is in the midst of two trials over harms to children. A trial underway in Los Angeles questions whether Meta’s platforms deliberately addict and harm minors. Another, in New Mexico, seeks to determine whether Meta failed to protect kids from sexual exploitation on its platforms. Thousands of families — along with school districts and government entities — have sued Meta and other social media companies claiming they deliberately design their platforms to be addictive and fail to protect kids from content that can lead to depression, eating disorders and suicide.

  • FlashMobOfOne@lemmy.worldOP · 8 hours ago

    Respectfully, I’ve heard all the arguments and do not care to litigate it again.

    What you and I think is immaterial. These are the only realistic outcomes in the current ecosystem. You can go have pointless arguments about it with someone else.

    • Typhoon@lemmy.ca · 7 hours ago

      I’m gonna make a statement and then say I don’t want to talk about the thing I just brought up.

    • village604@adultswim.fan · edited · 8 hours ago

      Parental controls are a thing that exists. It used to be that parents were responsible for monitoring what their children do, not the government or private corporations.