Call for Facebook beheadings rethink

A Facebook user had complained about images of severed heads, including this picture, censored here


One of Facebook’s safety advisers is to call on it to introduce safeguards to prevent users from stumbling upon gruesome images.

The move follows complaints about photos showing severed heads taken in a part of Syria controlled by the jihadist group Islamic State (IS).

The firm initially refused to delete the images, saying they did not contravene its guidelines.

It later blocked the material after being contacted by the BBC.

Stephen Balkam, chief executive of the US’s Family Online Safety Institute (Fosi), said he planned to raise the issue next month at a meeting of the Facebook Safety Advisory Board, on which he serves.

“There may be instances in which graphic photos and videos, like the beheadings in Syria, can be justified as being in the public interest,” he told the BBC.

“However, if they are hosted on Facebook or other social media platforms, there should be two barriers put in place.

“First, an interstitial, or cover page over the graphic images. With an interstitial in place, a user, particularly a child, will not have the image appear in their timeline or be easily seen if they are sent a link to the images.

“Secondly, there should be an age gate, saying that you must verify that you are 18 years of age. While this is easily circumvented, it does at least warn the user and may well deter both kids and adults alike.”


Facebook did briefly cover up a video of a Mexican beheading in 2013

Mr Balkam criticised Facebook last year after it rejected calls to delete a video clip showing a woman being beheaded in Mexico.

At the time, the site did briefly place a warning over the clip, before deciding to remove it on the basis that it “glorified” violence.

Facebook’s policy is that while the sharing of graphic content for sadistic pleasure is banned, the use of distressing images that are designed to condemn violence or to highlight an important issue is permitted.

“We do sometimes see people come to Facebook to share experiences of the world around them and on occasion this may result in content that some may find upsetting,” a spokesman told the BBC.

“We expect people that want to use Facebook to condemn or report on violence, to do so in a responsible manner, which may include warning people about the nature of content in the videos and imagery they’re sharing and carefully selecting the audience for the content.

“Our goal is to strike a balance between allowing people to comment on the often brutal world around them, whilst protecting people from the most graphic of content.”


Arabic translation

The most recent controversy centred on photos of severed heads posted to a Facebook page operated by a group called the Raqqa Media Center (RMC), which is based in an Islamic State-controlled city in Syria.


The Raqqa Media Center operates in an IS-controlled part of Syria

A producer from the broadcaster Al Jazeera reported the material to the site earlier this week, in a personal capacity.

She received a response saying: “It doesn’t violate our community standards.”

When the BBC subsequently contacted Facebook, a spokesman initially defended the decision, noting that “the page is run by a Syrian opposition group, not IS”.

He also referred to a 2013 blog post in which RMC complained that its members had been harassed by Isis, the name formerly used by IS.

Facebook’s rules ban IS and “terrorist groups” from using its site.

However, its community managers appear to have overlooked that one image posted on 24 August – which showed a man’s foot pressed against a recently severed head – was accompanied by text in Arabic that did appear to glorify the violence involved.

“Our people in Tabqa suffered, now we are stepping on the heads of people working in the airports,” it stated.

This might have been missed because Facebook’s site offers a translation service powered by Microsoft, which produced an incomprehensible interpretation. However, Facebook’s review team does employ Arabic speakers.

Observers note that the material posted by RMC has changed in tone since 2013, and now adopts IS terminology and rhetoric.


IS militants took control of the Tabqa airbase from the government last weekend

“They are operating in a city that is under jihadist control and it appears that the media environment there has become increasingly restrictive,” said Steve Metcalf of BBC Monitoring, which provides analysis to the UK government and others.

“IS is said to have imposed censorship and vetting of material posted online or distributed to other news organisations.

“This seems to be reflected in recent postings by RMC, which refrain from any criticism of IS, even posting eulogies for some dead IS fighters.”

Less gruesome images released by RMC are still used by several parts of the mainstream media, including the Daily Mail, Fox News and the Washington Times.

However, its Twitter account has been suspended, and no videos have appeared on its YouTube account since July.

Facebook has now blocked all access to RMC’s page on its site.


Warning tags

One expert said Facebook needed to do more to protect its 1.3 billion users from distressing material.

“Other sites have long worked this out,” said Dr Bernie Hogan, a social networks researcher at the University of Oxford. “Reddit, for instance, now uses the ‘not safe for work tag’ to restrict images that are violent in nature and clearly reprehensible to people.

“Facebook should follow in those footsteps and have, if not a zero-tolerance policy, at least some way for the community to very easily tag something as vulgar or violent.”


The Twitter account used by RMC was suspended earlier this month

But the digital rights group La Quadrature du Net said it would be concerned if Facebook started acting as “some kind of private police” by deciding itself which images should be hidden.

“Users could have an option to flag them, and after five, 10 or 100 such flags, such a page could be automatically added,” said its co-founder Jeremie Zimmermann.

“There is a whole difference between allowing users to flag content and asking Facebook to do it.”
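
For illustration only, the kind of threshold rule Mr Zimmermann describes could be sketched as follows. The names used here (FlagTracker, THRESHOLD, the page and user identifiers) are hypothetical and do not correspond to any real Facebook system or API; the sketch simply counts unique user flags on a page and signals when an agreed limit is reached.

    # Illustrative sketch only: a flag-threshold rule of the kind Zimmermann
    # suggests. All names are hypothetical, not part of any real Facebook system.
    from collections import defaultdict

    THRESHOLD = 10  # e.g. 5, 10 or 100 flags, as suggested in the quote

    class FlagTracker:
        def __init__(self, threshold=THRESHOLD):
            self.threshold = threshold
            # page_id -> set of user_ids who flagged it; counting unique users
            # stops a single person from triggering the threshold alone
            self.flags = defaultdict(set)

        def flag(self, page_id, user_id):
            """Record a user's flag; return True if the page should now be covered."""
            self.flags[page_id].add(user_id)
            return len(self.flags[page_id]) >= self.threshold

    # Usage: once enough distinct users flag a page, it would be placed behind
    # a warning automatically, rather than by a decision from Facebook staff.
    tracker = FlagTracker(threshold=5)
    for user in ("u1", "u2", "u3", "u4", "u5"):
        hidden = tracker.flag("example_page", user)
    print(hidden)  # True after the fifth unique flag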

http://www.bbc.co.uk/news/technology-28949472