YouTube Reviewers OK Terror Group Videos

By John Lister

Google says human error allowed videos from a banned Neo-Nazi group to remain on YouTube despite repeated reports. It says it will make changes to both its human and automated review processes to stop it happening again.

The videos involved National Action, a far-right Neo-Nazi group in the United Kingdom. The government there has proscribed the group. That's a special measure for groups that are actively carrying out, preparing or promoting terrorism rather than merely expressing offensive views. Under British law, it's illegal to be a member of a proscribed group or to take part in its meetings.

The UK government describes the group as "virulently racist, anti-Semitic and homophobic. Its ideology promotes the idea that Britain will inevitably see a violent 'race war', which the group claims it will be an active part of." It also notes the group celebrated the terrorist murder of a British politician. (Source: parliament.uk).

Human Reviewers Made 'Wrong Call'

Politicians questioned Google's counter-terrorism chief William McCants this week about YouTube hosting four propaganda videos for the group. While it's legally debatable whether the videos themselves are illegal, they are most definitely against YouTube's Terms of Service.

The videos have been reported multiple times over the past year, including eight reports from politicians. McCants said the most recent uploads were caught by the company's systems but mistakenly allowed to stay online.

Three of the videos were flagged automatically by YouTube's systems, with the fourth spotted by a human reviewer. All four were then manually checked, which McCants said was necessary to assess the context of the footage: for example, a clip from the videos could be used legally (and within the site's terms of service) if it was part of a news report on the subject.

Specialists Will Deal With Future Clips

McCants said the manual reviewers then made the wrong decision in leaving the videos online. He said that in future such reviews will be carried out by specialist experts who are familiar with proscribed groups. He also said training for all reviewers will be improved.

There will also be a tweak to the automated reviews so that the system can spot and flag shorter clips than it is currently able to detect. (Source: bbc.co.uk)

What's Your Opinion?

Should video hosting sites enforce content standards, or is it easier/justifiable to simply say that anything goes as long as it's legal? Can automated systems ever do a reliable job of judging the context of a clip? Is it realistic to expect companies to hire enough trained human reviewers to adequately cover the sheer number of clips that sites like YouTube host?


Comments

rwells78:

I am a fan of freedom of speech, but understand that is not a view shared worldwide. As such, I can't support censorship of any viewpoint in any medium.