The social network promoted some of the content too, according to a report
Facebook could be prosecuted in the UK for failing to remove child abuse images posted to the site despite being made aware of them, according to a new report.
“Dozens” of illegal images and videos were allegedly flagged to the social network’s moderators by a reporter using a fake profile, according to the Times.
The illegal content is said to include a video of an apparent sexual assault on a child, “violently paedophilic cartoons”, an “Islamic State beheading” and propaganda posters “glorifying recent terrorist attacks in London and Egypt”.
However, moderators reportedly took down only some of the flagged items, concluding that the majority did not breach the site’s Community Standards, which state: "We remove content that threatens or promotes sexual violence or exploitation. This includes the sexual exploitation of minors and sexual assault."
Regarding violence and graphic content, it continues: "We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence."
The report also claims that Facebook’s algorithms promoted some of the content by inviting users to join the groups that had published it, and that the site removed a number of the images only after being contacted by the publication.
“We have removed all of these images, which violate our policies and have no place on Facebook,” Justin Osofsky, Facebook’s vice president of global operations, told the Times.
“We are sorry that this occurred. It is clear that we can do better, and we’ll continue to work hard to live up to the high standards people rightly expect of Facebook.”
The Internet Watch Foundation recently revealed that paedophiles are using increasingly advanced techniques to hide child sexual abuse imagery online, but said that social networks are among the least abused site types.
“This is yet another appalling example of Facebook's failure to remove inappropriate and disturbing content from its own website despite users flagging it to them,” said an NSPCC spokesman.
“It poses serious questions about what content they consider appropriate and what their community standards actually consist of if they are disregarding users’ legitimate concerns.
“More and more young people are telling Childline about upsetting content they are seeing online so it’s crucial that social media platforms stop making up their own rules when it comes to child safety.”
Suspected child sexual abuse images and videos can be reported anonymously through the IWF website.