Suicide Prevention

Overview

Experts say that one of the best ways to help prevent a suicide is for people in distress to hear from others who care about them. Facebook has a unique role to play, through friendships on the site, in connecting people in distress with people who can offer support. Whether you're worried about someone you know or you're struggling on your own, we hope Facebook can help.

I need help for myself or for a friend now:

If you or a friend is going through a difficult time and would like support, we'd like to connect you with people who can help.

For Me

If you're going through a difficult time and want support, we'd like to connect you with tips and people who can help.

For a Friend

Review tips from suicide prevention experts to understand the best ways to support a person who's going through a difficult time.

Connect with Experts

Contact a helpline if you are looking for support, or if you need help supporting a friend. If you're concerned about someone else, encourage them to contact a helpline as well.

If someone is in immediate danger, call local emergency services immediately. Don't wait.


Our efforts on suicide prevention

There is one death by suicide in the world every 40 seconds, and suicide is the second leading cause of death for 15- to 29-year-olds. Because of the relationships people have on Facebook, we are in a unique position to help connect those in distress with friends or experts who can show support. Suicide prevention experts say these connections can be helpful in preventing suicide, and we see it happen in a variety of ways.

Since 2006, we’ve built our approach with experts in suicide prevention and safety and with input from people who have lived experience. We seek ongoing input on current research and best practices to ensure everyone’s safety is considered. Learn more about our ongoing engagement with experts.

For years, people have been able to report Facebook posts that they feel indicate someone is thinking about suicide. Trained members of our Community Operations team review these reports and connect the poster with support resources if needed. When there’s a risk of imminent harm, we work with emergency responders who can help.

In 2017, we began using machine learning in many countries to expand our ability to identify possible suicide and self-injury content and to get timely help to people in need. This technology uses signals, such as phrases in posts and concerned comments from friends and family, to identify possible suicide or self-injury content. Depending on what is identified, the content may be escalated for further review by members of our Community Operations team, who may decide to take additional steps, such as recommending that emergency services be contacted.


Our Policies

We care deeply about the safety of our community, and with the advice of experts, we set policies for what is and isn’t allowed on Facebook. For example, while we don’t allow people to celebrate or promote self-harm or suicide, we do allow people to discuss suicide and self-injury because we want Facebook to be a space where people can share their experiences, raise awareness about these issues, and seek support from one another. In some instances, we may restrict content to adults over the age of 18, include a sensitivity screen, and provide resources so that people are aware the content may be upsetting.

We work continuously with experts from around the world to strike a balance between important goals that are sometimes at odds. For example, when someone posts about self-harm, we want the person to be able to ask for help or share their path to recovery, but we must also consider the safety of the people who see that post. The post may unintentionally trigger thoughts of self-harm or suicide in others. We don't want people to share content that promotes self-harm, but we also don't want to shame or trigger the person who posted the content by removing their post.

We also constantly re-examine how we’re doing as we develop new products or see people using our services in new ways.

You can read about our suicide and self-injury policies by visiting the Community Standards website.


Connecting people in need with resources

When someone is expressing thoughts of suicide, it can be critical to get help as quickly as possible.

Suicide prevention resources have been available on Facebook for more than 10 years. These resources were developed in collaboration with mental health organizations such as Save.org, National Suicide Prevention Lifeline, Forefront and Crisis Text Line, as well as with input from people who have personal experience thinking about or attempting suicide.

If people see someone posting content about suicide, they can report the post and it will be reviewed by trained members of our Community Operations team, who can connect that person with support resources if needed.

In 2017, we began using machine learning in many countries to expand our ability to identify possible suicide or self-injury content and to get timely help to people in need. This technology uses pattern-recognition signals, such as phrases in posts and concerned comments from friends and family, to identify possible suicide or self-injury content. This helps us respond to reports faster.

We use artificial intelligence to prioritize the order in which our team reviews reported posts, videos and live streams. This helps us enforce our policies efficiently and get resources to people quickly. It also lets our reviewers prioritize the most urgent posts and contact emergency services when members of our community might be at risk of harm. Speed is critical.
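Facebook has not published how this prioritization works internally. Purely as an illustration of the idea, the sketch below, in Python, orders a few hypothetical reports by a risk score, a stand-in for the classifier output described later on this page, so that the highest-risk report is reviewed first.

    # Illustrative only: order reported posts so reviewers see the highest-risk
    # report first. The risk scores here are hypothetical classifier outputs.
    import heapq

    reports = [
        (0.35, "post-a"),
        (0.92, "post-b"),   # e.g. comments like "Tell me where you are"
        (0.61, "post-c"),
    ]

    # heapq is a min-heap, so we push the negated score to pop the highest risk first.
    queue = []
    for risk_score, post_id in reports:
        heapq.heappush(queue, (-risk_score, post_id))

    while queue:
        neg_score, post_id = heapq.heappop(queue)
        print(f"review {post_id} next (risk score {-neg_score:.2f})")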

In addition to those content-moderation AI tools, we’re using automation so the team can more quickly access the appropriate first responders’ contact information.

By using technology to prioritize and streamline these reports, we are able to escalate the content to our Community Operations team, who can more quickly decide whether there are policy violations and whether to recommend contacting local emergency responders. We are committed to continuing to invest in technology to better serve our community.

How the text and comment classifiers work

As described above, this technology uses signals such as phrases in posts and concerned comments from friends and family to identify possible suicide or self-injury content. Posts that are likely to indicate an imminent risk of harm tend to have comments like “Tell me where you are” or “Has anyone heard from him/her?”, while potentially less urgent posts have comments more along the lines of “Call anytime” or “I’m here for you.” We understand from experts that the day and time of the original posting can be important factors in when someone may be contemplating suicide, so we also include these variables to help our technology prioritize potential suicide- and self-injury-related content for our reviewers.

We created two text classifiers: one that scores the main text of posts and another that scores the comments. The scores from these classifiers, combined with other factors (such as the time of day, the day of the week, or the post type—for instance, whether the post was made on a person’s own timeline or a friend’s timeline), provide the inputs to a random forest learning algorithm, which specializes in learning from numerical features.

The random forest algorithm builds a classifier that scores posts according to their similarity with previously identified posts that express suicidal thoughts. If this classifier gives a post a high score, the post is sent to our Community Operations team, which decides whether to remove it in line with our policies, send resources (which include details of local organizations that can provide support), escalate it for outreach to authorities, or leave it on the platform.
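As an illustration of that pipeline only, the sketch below, in Python, combines two stand-in text-classifier scores with time-of-day, day-of-week and post-type features, trains a scikit-learn random forest on a tiny made-up labeled set, and routes high-scoring posts to human review. Every function, feature and data point here is assumed for the example; Facebook's production classifiers, features and thresholds are internal and not published.

    # Illustrative sketch only (not Facebook's code): combine text-classifier
    # scores with contextual features and let a random forest decide whether a
    # post should be escalated to human review.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def score_post_text(text):
        """Stand-in for the post-text classifier; returns a score in [0, 1]."""
        return 0.9 if "goodbye" in text.lower() else 0.1   # placeholder heuristic

    def score_comments(comments):
        """Stand-in for the comment classifier; returns a score in [0, 1]."""
        urgent = ("tell me where you are", "has anyone heard from")
        return 0.9 if any(u in c.lower() for c in comments for u in urgent) else 0.1

    def build_features(post):
        """Numerical features: classifier scores plus time, day and post type."""
        return [
            score_post_text(post["text"]),
            score_comments(post["comments"]),
            post["hour_of_day"],                       # 0-23
            post["day_of_week"],                       # 0 = Monday ... 6 = Sunday
            1.0 if post["on_own_timeline"] else 0.0,   # post type
        ]

    # Tiny, made-up training set: feature vectors for previously reviewed posts,
    # labeled 1 if they expressed suicidal thoughts and 0 otherwise.
    X_train = np.array([
        [0.9, 0.9, 2, 6, 1.0],
        [0.1, 0.1, 14, 2, 0.0],
        [0.8, 0.1, 23, 5, 1.0],
        [0.1, 0.2, 10, 1, 1.0],
    ])
    y_train = np.array([1, 0, 1, 0])
    forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

    def route(post, threshold=0.5):
        """Posts scoring above the threshold go to Community Operations for review."""
        score = forest.predict_proba([build_features(post)])[0, 1]
        return "send_to_community_operations" if score >= threshold else "no_action"

In the real workflow, the model only prioritizes and routes content; the decision to remove a post, send resources or contact authorities rests with the Community Operations reviewers, as described above.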

It is important to note that, as described above, our algorithms are intended to help identify content related to suicide and self-injury to help us remove content that violates Facebook's policies and to flag certain content for review by our Community Operations team. The algorithms do not perform any clinical or diagnostic functions and are not intended to diagnose or treat any mental health or other condition. We do not provide treatment, therapy, coaching, or medical advice, nor does Facebook replace professional health care services. People experiencing an emergency or emotional crisis should contact emergency services or a helpline immediately.


Tools

Our technology to identify possible suicide and self-injury Facebook posts is also integrated into Facebook Live. If it seems like someone in a live video is considering self-harm, people watching the video have the option to reach out to the person directly and to report the video to us.

Whether a post is reported by a concerned friend or family member or identified by machine learning, the next step is the same: a member of Facebook’s Community Operations team reviews the report to determine whether there are any policy violations and whether there may be an imminent risk of self-harm. If so, the original poster is shown support options. For example, we encourage people who are going through a difficult time to reach out to a friend, and we even offer pre-populated text to make it easier for people to start a conversation. We also suggest contacting a helpline and offer other tips and resources for people to help themselves in that moment.

In serious cases, where our Community Operations team is concerned about an imminent danger of self-harm, Facebook may contact emergency services to conduct a wellness check. Thanks to our technology and to reports from friends and family on Facebook and Instagram, we’ve helped first responders quickly reach people around the world who needed help. We also provide resources and support to the person who flagged the troubling post, including options to call or message their distressed friend to let them know they care, or to reach out to another friend or a trained professional at a suicide hotline for support. All of these resources were created in partnership with our clinical and academic partners. In addition, our content-moderation efforts have helped to significantly reduce the amount of harmful content that people are exposed to.

Partner Testimonials


“Facebook works with mental health and suicide prevention experts on an ongoing basis to ensure they are using the most current research and knowledge to help them develop guidelines and best practices for determining what content might be triggering for others, and when that content should be removed or made less visible. When someone expresses suicidal distress, it provides family, friends and even Facebook and Instagram the opportunity to intervene. If people can’t share their pain, or it is shared and then removed, we’ve missed a chance to save someone’s life. We train people to listen for this in conversations and to allow people to keep talking because we know that it is one way to help them through a crisis. Social media platforms allow us to do this in a way that brings many people to help very quickly.”

-Daniel J. Reidenberg, Psy.D., FAPA, Executive Director, Save.org

“Mental illness and thoughts about suicide are just not something we talk about OPENLY. Yet talking and connecting is crucial to helping prevent depression and suicide. The tools Facebook is rolling out aim both at people who are expressing suicidal thoughts and at guiding concerned friends or family members to resources, alternatives and appropriate interventions. People use Facebook widely, so there's an opportunity to actually connect someone who is struggling to a person they have a relationship with. This is extremely important.”

-Anna Chandy, Chairperson - Trustees, The Live Love Laugh Foundation, India

“The Finnish Association for Mental Health is pleased to work with Facebook to provide support and resources for people who are feeling vulnerable and at risk of suicide, and for their close ones. To have this support available is important because, when in crisis, people often don't have the strength or courage to seek help and can even find it hard to realize that they need help. To have resources and contact information for experts made available in the language people speak is crucial. It also makes it easier for a friend or family member to help someone who may be having thoughts of suicide or self-harm. Studies and our experiences show that suicide crises can be overcome. It's important to know who to contact and when. Facebook is working towards this and we are happy to be a part of this important work.”

-Satu Raappana-Jokinen, Manager of Online Crisis Services, The Finnish Association for Mental Health

“Suicide is a public health problem that has become more serious year after year, in great part due to the ‘taboo’ around it. Any initiative that helps raise awareness and prevent suicide is essential to reduce the tragic statistics showing that one Brazilian dies by suicide every 45 minutes. Some people can identify signals that a friend is thinking about suicide but do not know how to help. In these cases, tools like the one launched by Facebook in Brazil can directly and indirectly help this cause, which CVV has embraced for more than 50 years, and serve as a model for other organizations to overcome taboos and embrace further actions on this subject.”

-Carlos Correia, Volunteer, Centro de Valorização da Vida, Brazil
