
Google’s response not good enough, says Chair of Committee

06 June 2018

The Home Affairs Committee publishes Google’s response to its questions on content moderation and review processes on YouTube.

Google's undertaking

During an evidence session for its inquiry into hate crime in March, the Committee questioned William McCants, Global Leader for Counterterrorism at YouTube, about the continued availability on the platform of material relating to National Action, a proscribed organisation, and other extremist groups. This material remained available despite repeated undertakings, given by Google to the Committee over a 12-month period, that it would be removed.

Request for additional information

After the session, Committee Chair Yvette Cooper wrote to Google requesting additional information about its content moderation and review processes.
 
Both the letter and Google’s response are attached. Google’s response confirms that, of the 4,200 people working on content moderation, only 200 are directly employed; the rest are employed through contractors, and none of the 4,200 are currently based in the UK.

Chair's comments

Chair of the Committee, Yvette Cooper commented:

"Google’s response just isn’t good enough.
 
This incredibly rich and powerful global company has a huge responsibility to stop its platforms being used for crime, extremism and damage to young people. Yet in most cases it doesn’t even employ its own staff to work on tackling illegal or abusive content; it contracts the problem out to others instead. And it also turns out that none of those specialist reviewers are based in the UK at all.
 
If a lack of directly employed staff in the UK explains why YouTube were so utterly hopeless at removing banned National Action videos, it proves they need to think again. We raised those illegal videos repeatedly over twelve months with top Google and YouTube executives, yet we still found them on the platform. Google have already admitted to us that their content moderators weren’t sufficiently sensitive to far-right extremism and terror threats in the UK. Now we learn why, if none of them are based here.
 
Given that preventing illegal activity online should be a huge part of YouTube and Google’s central corporate purpose, it is frankly astonishing that less than 5% of those working on content moderation decisions are actually employed directly by the company.
 
When Mr McCants gave evidence to us he didn’t even know whether the illegal videos we had warned about were reviewed by YouTube employees or by outsourced staff, nor could he tell us who conducted the training of the contracted reviewers. So there has clearly been no grip of the contracting or training arrangements from the centre.
 
At a time when we know far-right extremism is on the rise in the UK, online companies have a responsibility to act proactively and decisively to do all they can to ensure extremist views are not given a platform, rather than responding only after the negative publicity of a public hearing.
 
We raised some of these issues in our interim report last year and are considering this evidence now in our final report on hate crime."

