Instagram said Thursday it will start alerting parents if their kids repeatedly search for terms clearly associated with suicide or self-harm. The alerts will only go to parents who are enrolled in Instagram’s parental supervision program.
Instagram says it already blocks such content from showing up in teen accounts’ search results and directs people to helplines instead.
The announcement comes as Meta is in the midst of two trials over harms to children. A trial underway in Los Angeles examines whether Meta's platforms deliberately addict and harm minors; another, in New Mexico, seeks to determine whether Meta failed to protect kids from sexual exploitation on its platforms. Thousands of families, along with school districts and government entities, have sued Meta and other social media companies, claiming the companies deliberately design their platforms to be addictive and fail to protect kids from content that can lead to depression, eating disorders and suicide.

I cannot imagine giving a social media platform enough information to even do this. Maybe just don’t?
I think that’s a very easy thing to say, but for the younger generations, using social media isn’t too dissimilar from breathing. It’s just something you do.
You can do it without revealing your real identity.