Facebook Fails to Take Down Images Exploiting Children
A major news organization says Facebook doesn't do enough to remove images that exploit children. The British Broadcasting Corporation (BBC) says the site removed just 18 out of 100 images that it reported as part of an investigation.
The test was part of a follow-up to a report last year about people using members-only groups on the site to share inappropriate images of children. At the time, Facebook said it was improving its moderation system.
To see if that was the case, the BBC used Facebook's formal reporting button to report 100 images that appeared to breach the site's stated guidelines. They included a mixture of explicit images along with those that were not inherently explicit but had been posted with inappropriate comments in groups dedicated to child exploitation.
Only One in Five Pics Removed
The BBC waited until it had a response for every image. Eighteen were removed, but for each of the remaining 82, Facebook sent an automated message saying the image didn't breach community standards. (Source: bbc.co.uk)
The figures raise questions about exactly how Facebook assessed the images in question. That the pictures remained online may suggest the moderation process was highly automated and didn't assess the context of the images and the pages on which they appeared.
Facebook's rules also bar anyone convicted of particular crimes against children from having a profile on the site. The BBC says it reported five accounts that breached these rules, but none were removed.
BBC Interview Request Prompts Facebook Report to Crime Authority
When the BBC asked for an interview, Facebook agreed but said it would first need to see examples of the material that the BBC had reported and was still online. When the BBC provided the examples, Facebook immediately reported the infractions to the National Crime Agency - the body that coordinates efforts by regional police forces to fight serious and organized crime.
Facebook has since issued a statement which reads: "We have carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards. This content is no longer on our platform. We take this matter extremely seriously and we continue to improve our reporting and take-down measures. Facebook has been recognized as one of the best platforms on the Internet for child safety." (Source: techcrunch.com)
What's Your Opinion?
Is Facebook doing enough to remove inappropriate images? Should images of children that aren't inherently illegal be removed if it's clear they are being used in an exploitative fashion? Is it practical to apply nuance and context to moderation on a site with as much content as Facebook?