YouTube has quietly added a new category viewers can use to flag inappropriate videos: “promotes terrorism.”
The new category, which went live in early November, is a subcategory of “Violent or Repulsive Content,” one of the options users can choose when they report material on the site. The addition came shortly after lawmakers in the U.S. and Britain pressured the site to do something about terror-related content.
Last month, as The New York Times reported, the site removed hundreds of videos featuring the American cleric Anwar al-Awlaki, who is connected to Al Qaeda in the Arabian Peninsula, after taking heat from Rep. Anthony Weiner (D-NY) and British officials. But many videos of Awlaki remain easily accessible on the site. The new category will let users help the site keep track of terror-related videos amid the 24 hours’ worth of video uploaded every minute.
Users flag thousands of videos on YouTube every day, and human reviewers examine those reports 24 hours a day. Not all flagged videos are taken down, however; those found to be newsworthy or to contain religious speech will not be removed.
A YouTube spokesperson told TPM: “To make it easier for our users to identify content that incites violence, last month we added a flag labeled ‘promotes terrorism’ to the flagging menu. We review flagged videos around the clock, and if we find that they violate our policies, we take them down.”
Sen. Joe Lieberman (I-CT) called the new protocols a “good first step toward scrubbing mainstream Internet sites of terrorist propaganda,” according to the Los Angeles Times. Still, he added, “it shouldn’t take a letter from Congress — or in the worst possible case, a successful terrorist attack — for YouTube to do the right thing.” But George Washington University law professor Jeffrey Rosen called the move “potentially troubling,” arguing that the phrase “promotes terrorism” is more subjective than YouTube’s other flagging categories.
[Ed note: This article was edited after publication]