Reddit currently has a feature titled:
“Someone is considering suicide or serious self-harm”
which allows users to flag posts or comments when they are genuinely concerned about someone’s mental health and safety.
When such a report is submitted, Reddit’s system sends an automated private message to the reported user containing mental health support resources, such as contact information for crisis helplines (e.g., the Suicide & Crisis Lifeline, text and chat services, etc.).
In some cases, subreddit moderators are also alerted, although Reddit does not provide a consistent framework for moderator intervention.
The goal of the feature is to offer timely support to users in distress and reduce the likelihood of harm.
However, there have been valid concerns about misuse, such as false reports being used to harass users, as well as about the lack of moderation tools and guidance for handling these sensitive situations.
Given Lemmy’s decentralized, federated structure and commitment to privacy and free expression, would implementing a similar self-harm concern feature be feasible or desirable on Lemmy?
Some specific questions for the community:
Would this feature be beneficial for Lemmy communities/instances, particularly those dealing with sensitive or personal topics (e.g., mental health, LGBTQ+ support, addiction)?
How could the feature be designed to minimize misuse or trolling, while still reaching people who genuinely need help?
Should moderation teams be involved in these reports? If so, how should that process be managed given the decentralized nature of Lemmy instances?
Could this be opt-in at the instance or community level to preserve autonomy? (A rough sketch of what that might look like follows this list.)
Are there existing free, decentralized, or open-source tools/services Lemmy could potentially integrate for providing support resources?
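To make the opt-in question and the automated-message flow a bit more concrete, here is a minimal sketch in Rust (the language Lemmy's backend is written in). To be clear, none of these types, fields, or messages exist in Lemmy; `InstanceSettings`, `ConcernReport`, and `handle_concern_report` are purely illustrative names, and a real implementation would also need rate-limiting and deduplication so repeated reports can't be used to spam someone.

```rust
// Hypothetical sketch only; nothing here is part of Lemmy's actual codebase.
// Models an opt-in, per-instance "concern report" that results in a single
// automated private message pointing the reported user at support resources.

#[derive(Debug, Clone)]
struct InstanceSettings {
    /// Opt-in flag an instance admin could toggle (hypothetical field).
    concern_reports_enabled: bool,
    /// Support resources the instance chooses to include in the message.
    support_resources: Vec<String>,
}

#[derive(Debug)]
struct ConcernReport {
    reporter_id: u64,
    reported_user_id: u64,
    post_id: u64,
}

#[derive(Debug)]
struct PrivateMessage {
    recipient_id: u64,
    body: String,
}

/// Handle a concern report: if the instance has opted in, build the automated
/// support message; otherwise do nothing, so instance autonomy is preserved.
fn handle_concern_report(
    settings: &InstanceSettings,
    report: &ConcernReport,
) -> Option<PrivateMessage> {
    if !settings.concern_reports_enabled {
        return None;
    }
    // A production version would check here that this user has not already
    // received a message recently, to blunt report-based harassment.
    let mut body = String::from(
        "Someone on this instance was concerned about you. \
         If you are struggling, these resources may help:\n",
    );
    for resource in &settings.support_resources {
        body.push_str(&format!("- {resource}\n"));
    }
    Some(PrivateMessage {
        recipient_id: report.reported_user_id,
        body,
    })
}

fn main() {
    let settings = InstanceSettings {
        concern_reports_enabled: true,
        support_resources: vec![
            "988 Suicide & Crisis Lifeline (call or text 988, US)".into(),
            "Local crisis resources chosen by the instance admin".into(),
        ],
    };
    let report = ConcernReport {
        reporter_id: 1,
        reported_user_id: 2,
        post_id: 42,
    };
    if let Some(msg) = handle_concern_report(&settings, &report) {
        println!("Would send to user {}:\n{}", msg.recipient_id, msg.body);
    }
}
```

The main point the sketch tries to capture is that an instance which never opts in simply ignores these reports, so nothing changes for admins or communities that don't want the feature.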
Looking forward to your thoughts—especially from developers, mods, and mental health advocates on the platform.
In my experience as a subreddit mod, that feature was nearly exclusively used for harassment, usually transphobic harassment. In the one or two cases where a report actually concerned someone with suicidal or self-harm ideation, there was still zilch I could do; I would just approve the post so the user could get support and speak to others (the subreddit was a support group for a sensitive subject, so it wasn't out of place for a post to say that the stress of certain things was making someone suicidal).