Facebook Explains ‘Report Suicidal Content’ Tool

Graffiti message including a Facebook "Like" thumb.

Facebook on Tuesday announced a new suicide prevention tool that has since drawn enormous attention, praise and head-scratching around the Web.

“The new service enables Facebook users to report a suicidal comment they see posted by a friend to Facebook using either the Report Suicidal Content link or the report links found throughout the site,” wrote U.S. Surgeon General Dr. Regina Benjamin in a post on Facebook’s Safety Blog on Tuesday afternoon.

Now Facebook has explained the tool and the process that led to its creation in greater detail to TPM.

As Facebook spokesman Frederic Wolens told TPM: “Facebook has been working closely with the suicide prevention community for many years, including a long standing relationship with the [National Suicide Prevention] Lifeline.

Wolens continued: “We have been an active participant in the Action Alliance (a private-public partnership aimed at suicide prevention) and hosted the New Media Partnership for Suicide Prevention at our offices last year. This new initiative was a natural extension of our work in this area.”

The tool, which is only available in the U.S. and Canada at present, is a drop-down menu that allows users to anonymously flag posts on the world’s most popular social network as “Suicidal Content.”

The tool isn’t readily apparent on the website, however.

Users can access the tool by clicking in the upper right-hand corner of a friend’s Facebook post, which pulls down the option to “Report story or spam.”

Clicking on this option will automatically mark the post as “Spam,” but Facebook now prompts users with the following new message: “If this story is abusive, please file a report.”

Clicking on “File a Report” will pull up yet another menu that asks “Is this post about you or a friend?” and presents a list of options. Under the option “No, this post is about something else,” users can check “Violence or harmful behavior” and then further specify “Suicidal Content” or “Self-harm.”

They can also skip those steps and report directly by entering the name of the person who posted the content and a link to their Facebook page using this form, though it is not immediately obvious how to reach the form without a direct link.

Once a report is filed, Facebook’s User Observations team will email the user who posted the flagged content a link to a private web chat with a crisis representative from the National Suicide Prevention Lifeline.

Wolens told TPM that Facebook would guard against abuse of the tool, and people crying wolf, by having the User Observations team carefully screen all reports and send the private link only in cases deemed “actionable.”

However, Wolens said that in cases where Facebook has information indicating the threat is imminent, the company will “take all possible action to get help.”

Citing privacy concerns, Wolens couldn’t provide specific examples of what else Facebook has done or will do when a report indicates an imminent threat to a user’s safety, but he said the company would call local emergency and law enforcement authorities to respond, a step it also urges users to take on its “Report Suicidal Content” entry form.

When asked whether Facebook was influenced by any particular events, including the tragic suicide of Rutgers student Tyler Clementi in September 2010, after Clementi posted a message on his Facebook wall saying “Jumping off the gw bridge sorry,” Wolens said that Facebook’s ongoing safety efforts were more of an impetus.

As Wolens explained to TPM: “We have been working with the suicide prevention community for many years, and this program represented a natural extension to our work in this area rather than any specific events.”

Facebook also said that it is moving as quickly as possible to roll out the tool globally to its 800 million active users. As Wolens told TPM: “We are working with our global suicide prevention partners to provide the best help possible, we hope to work on similar initiatives with these partners in the coming months and years.”
