Help Users Struggling With Self-Harm and Suicide

How will you identify users at risk of hurting themselves?


There is a lot of talk on the Internet about suicide and self-harm.

When you created your social product, did you consider how you would handle these sensitive topics? How do you know if it’s just teen emo talk, or if a person is in real danger of harming themselves? How will you handle users who are truly considering self-harm or suicide? Cries for help should be taken seriously because the consequences can be tragic. There is nothing more heartbreaking than suicide.

When it comes to delicate topics like self-harm and suicide, it’s best to turn to the experts. We’ve designed our system to recognize the crucial difference between a statement like “I want to kill myself” and “You should kill yourself”, a difference that can literally mean life or death. We have partnered with major law enforcement agencies to study real chat logs from users going through difficult times, and we have identified hundreds of linguistic patterns related to suicide and self-harm. We can design triggers that alert you in real time when a user has posted alarming content more than once.
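At a high level, that kind of trigger can be sketched as follows. This is a minimal illustration, not Community Sift's actual system: the phrase lists, the `ALERT_THRESHOLD` of two posts, and the class and label names are all assumptions made for the example.

```python
from collections import defaultdict

# Illustrative phrase lists only -- a real system uses hundreds of
# vetted linguistic patterns, not a handful of literal strings.
SELF_DIRECTED = ["i want to kill myself", "i want to hurt myself"]
OTHER_DIRECTED = ["you should kill yourself", "go kill yourself"]

ALERT_THRESHOLD = 2  # alert once a user posts alarming content more than once


class SelfHarmTrigger:
    def __init__(self):
        self.alarm_counts = defaultdict(int)  # per-user count of alarming posts

    def classify(self, message):
        """Distinguish a possible cry for help from suicide-related bullying."""
        text = message.lower()
        if any(phrase in text for phrase in SELF_DIRECTED):
            return "self_harm_risk"  # self-directed: possible cry for help
        if any(phrase in text for phrase in OTHER_DIRECTED):
            return "bullying"        # other-directed: harassment
        return None

    def check(self, user_id, message):
        """Return an alert dict when a user repeatedly posts alarming content."""
        label = self.classify(message)
        if label is None:
            return None
        self.alarm_counts[user_id] += 1
        if self.alarm_counts[user_id] >= ALERT_THRESHOLD:
            return {"user": user_id, "label": label,
                    "count": self.alarm_counts[user_id]}
        return None
```

The key design point the example tries to capture is the same one described above: self-directed and other-directed statements are routed to different labels, because they call for very different responses.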

Our team of industry veterans can guide you through the tough decisions your community requires: should you block all suicide-related content, or just suicide-related cyberbullying? Where can you direct at-risk users who need professional help? By asking the right questions, we’ve helped many social products build a strategy for dealing with self-harm and suicide in their communities.

You will likely deal with the following:

  • Real cries for help. Users will reach out when they need help. We recommend using automation to push messages of encouragement and links to hotlines like the Crisis Text Line or National Suicide Prevention Lifeline where at-risk users can receive professional assistance.
  • Volume. Users often joke about killing themselves or use over-the-top suicidal language to make a point, so you will see a large amount of potentially alarming user-generated content. You could manually review everything, but that’s time-consuming and inefficient. To help prioritize, your best option is to leverage automation and escalate content based on user reputation — users who consistently post extreme content should be reviewed by human moderators.
  • Bullying and threats. We’ve identified a behavioral flow that shows a direct link between cyberbullying/harassment and self-harm/suicide. When users are bullied, they are more likely to turn to suicidal thoughts and self-harming behavior. It’s important that you filter cyberbullying in your product to prevent vulnerable users from getting caught in a vicious circle.
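The reputation-based prioritization described in the “Volume” bullet could look something like the sketch below. The scoring formula, weights, and queue structure are invented for illustration; the idea is simply that a user's history of alarming posts pushes their content to the front of the human review queue.

```python
import heapq


def escalation_priority(reputation_score, severity):
    """Combine a user's history with message severity.

    Both inputs and the weighting are hypothetical -- tune them
    for your own community.
    """
    return reputation_score * 2 + severity


class ReviewQueue:
    """Priority queue so moderators review the riskiest content first."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker: preserves insertion order on equal priority

    def add(self, user_id, message, reputation_score, severity):
        priority = escalation_priority(reputation_score, severity)
        # Negate priority: heapq is a min-heap, and we want the highest first.
        heapq.heappush(self._heap, (-priority, self._counter, user_id, message))
        self._counter += 1

    def next_for_review(self):
        if not self._heap:
            return None
        _, _, user_id, message = heapq.heappop(self._heap)
        return user_id, message
```

With this structure, a first-time joke from a low-reputation user waits in line, while a repeat poster of extreme content surfaces immediately for a human moderator.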

Suicide and self-harm are challenging topics to face. If acknowledging them — and creating an effective strategy for dealing with them in your community — helps save just one life, then it’s worth it. We would love to hear from you and help you craft a strategy that can save real lives.

Request a Demo

Want to see what Community Sift can do for your social product? Request a demo.
