Facebook has shared an update on World Suicide Prevention Day on what it has learned and the steps it has taken in the past year, as well as additional actions it plans to take, to make its apps safer for people, particularly those who are most vulnerable. Earlier this year, Facebook began hosting regular consultations with experts from around the world to discuss some of the more complex topics associated with suicide and self-injury. The consultations addressed how Facebook deals with the risks of sad content online, suicide notes, and newsworthy depictions of suicide. The details of these extensive meetings are available on Facebook's updated Suicide Prevention page in its Safety Center. As a result of these consultations, Facebook has made several changes to improve how it handles this content.

“We tightened our policy around self-harm to no longer allow graphic cutting images to avoid unintentionally promoting or triggering self-harm, even when someone is seeking support or expressing themselves to aid their recovery. On Instagram, we’ve also made it harder to search for this type of content and kept it from being recommended in Explore. We’ve also taken steps to address the complex issue of eating disorder content on our apps by tightening our policy to prohibit additional content that may promote eating disorders. And with these stricter policies, we’ll continue to send resources to people who post content promoting eating disorders or self-harm, even if we take the content down. Lastly, we chose to display a sensitivity screen over healed self-harm cuts to help avoid unintentionally promoting self-harm,” said Facebook in a statement.

Because its engagement with outside experts has proven so valuable, Facebook is now hiring a well-being and health expert to join the team and strengthen the program.
“This person will focus exclusively on the health and well-being impacts of our apps and policies, and will explore new ways to improve support for our community, including on topics related to suicide and self-injury.”

“And for the first time, we’re also exploring ways to share public data from our platform on how people talk about suicide, beginning with providing academic researchers with access to the social media monitoring tool, CrowdTangle. To date, CrowdTangle has been available primarily to help newsrooms and media publishers understand what is happening on Facebook. But we are eager to make it available to select researchers who focus on suicide prevention to explore how information shared on Facebook and Instagram can be used to further advancements in suicide prevention and support. In addition to all we are doing to find more opportunities and places to surface resources, we’re continuing to build new technology to help us find and take action on potentially harmful content, including removing it or adding sensitivity screens. From April to June of 2019, we took action on more than 1.5 million pieces of suicide and self-injury content on Facebook and found more than 95% of it before it was reported by a user. During that same time period, we took action on more than 800,000 pieces of this content on Instagram and found more than 77% of it before it was reported by a user.”

According to experts, one of the most effective ways to prevent suicide is for people to hear from friends and family who care about them. To help young people safely discuss topics like suicide, the social networking company is enhancing its online resources by including Orygen’s #chatsafe guidelines in Facebook’s Safety Center and in the resources shown on Instagram when someone searches for suicide or self-injury content.
The #chatsafe guidelines were developed together with young people to provide support to those who might be responding to suicide-related content posted by others, or to those who might want to share their own experiences with suicidal thoughts, feelings, or behaviors.