The Committee recommends that the Government assess whether failure to remove illegal material is in itself a crime and, if not, how the law should be strengthened. It also recommends that the Government consult on a system of escalating sanctions, including meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.
- Government should consult on stronger law and system of fines for companies that fail to remove illegal content
- Social media companies that fail to proactively search for and remove illegal material should pay towards costs of the police doing so instead
- Social media companies should publish regular reports on their safeguarding activity including the number of staff, complaints and action taken
Illegal and dangerous content
Government should, according to the report, consult on proposals requiring companies that fail to properly search for illegal material to pay towards the cost of policing and enforcement activities on their platforms.
Given their immense size, resources and global reach, the Committee considers it "completely irresponsible" that social media companies are failing to tackle illegal and dangerous content and to implement even their own community standards.
Responsibility of social media companies
The Committee criticises the "unacceptable" refusal by companies to reveal the number of people they employ to safeguard users, or the amount they spend on public safety. It recommends that companies publish quarterly, transparent reports covering safeguarding, enforcement of standards and the number of staff working on safety.
The Committee also criticises social media companies for putting profit before safety – noting that quick action is taken to remove content found to infringe copyright rules, but that the same prompt action is not taken when the material involves hateful or illegal content. The Committee recommends that the same expertise and technology be applied to illegal and abusive content.
The Committee recognises the effort that has been made to tackle abuse on social media, such as publishing clear community guidelines, building new technologies and promoting online safety, for example for schools and young people, but it is clear from the evidence received that nowhere near enough is being done. Social media companies' enforcement of their own community standards is weak, haphazard and inadequate. Smaller companies often have even lower standards and make less effort.
The Committee says Government should now conduct a review of the entire legal framework around online hate speech, abuse and extremism and ensure the law is up to date. Enforcement needs to be much stronger. What is illegal offline should be illegal – and enforced – online.
Lack of action
The Committee found repeated examples of illegal material not being taken down after it had been reported, including:
- Terror recruitment videos for banned jihadi and neo-Nazi groups still live even after being reported by the Committee
- Antisemitic hate crime attacks on MPs, still live even after being raised by MPs themselves and in a previous Committee report
- Material encouraging child abuse or sexual images of children, still accessible even after being reported by journalists
- Government should now assess whether the continued publication of illegal material and the failure to take reasonable steps to identify or remove it is in itself already a breach of the law, and how the law could be strengthened in this area.
- Football teams are obliged to pay for policing in their stadiums and immediate surrounding areas on match days. Government should now consult on adopting similar principles online – for example, requiring social media companies to contribute to the costs of the Metropolitan Police's Counter Terrorism Internet Referral Unit (CTIRU) for enforcement activities which should rightfully be carried out by the companies themselves.
- Social media companies should publish quarterly reports on their safeguarding efforts, including analysis of the number of reports received on prohibited content, how the companies responded to reports, and what action is being taken to eliminate such content in the future. Transparent performance reports, published regularly, would be an effective method to radically drive up standards and encourage competition between platforms to find innovative solutions to these persistent problems. If they refuse to do this voluntarily, the Government should consult on requiring them to do so.
- The interpretation and implementation of community standards in practice is too often slow and haphazard. Social media companies should review with the utmost urgency their community standards and the way in which they are being interpreted and implemented, including the training and seniority of those who are making decisions on content moderation, and the way in which the context of the material is examined.
- Most legal provisions in this field predate the era of mass social media use and some predate the internet itself. The Government should review the entire legislative framework governing online hate speech, harassment and extremism and ensure that the law is up to date. It is essential that the principles of free speech and open public debate in democracy are maintained – but protecting democracy also means ensuring that some voices are not drowned out by harassment and persecution, by the promotion of violence against particular groups, or by terrorism and extremism.
Rt Hon Yvette Cooper MP, Chair of the Committee, said:
"Social media companies' failure to deal with illegal and dangerous material online is a disgrace. They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful. These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people's lives. This isn't beyond them to solve, yet they are failing to do so. They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe.
"In this inquiry, it has been far too easy to find examples of illegal content from proscribed organisations – like National Action or jihadist groups – left online. And while we know that companies for the most part take action when Select Committees or newspapers raise issues, it should not take MPs and journalists to get involved for urgent changes to be made. They have been far too slow in dealing with complaints from their users – and it is blindingly obvious that they have a responsibility to proactively search their platforms for illegal content, particularly when it comes to terrorist organisations. Given their continued failure to sort this, we need a new system including fines and penalties if they don't swiftly remove illegal content.
"Social media companies need to start being transparent about what they do. The idea that they can't tell us what resources they put into public safety for commercial reasons is clearly ludicrous.
"The government should also review the law and its enforcement to ensure it is fit for purpose for the 21st century. No longer can we afford to turn a blind eye."
Dissolution of Parliament
The announcement of the General Election curtailed the Committee's consideration of the full range of issues in this inquiry, and the recommendations have had to be limited to dealing with online hate, arguably the most pressing issue which needs to be addressed now.
However, it is hoped that the successor committee in the next Parliament will return to this highly significant topic and will draw on the wide-ranging and valuable evidence gathered in this inquiry to inform broader recommendations across the spectrum of challenges which tackling hate crime presents.