“I'm pleased that the Government has accepted the central recommendation of the DCMS Select Committee’s Report on Disinformation and ‘fake news’, that the social media companies should have a legal liability to take down harmful content hosted on their platforms.
“In our report we called for legislation that would create a regulatory system for online content that’s as effective as that for content offline. We’re pleased the Government has recognised the urgent need for independent regulation that we identified in our Report, in order to move beyond self-regulation and set clear standards with the back-up of enforcement. We made the case for funding for the Regulator to come not from the taxpayer, but from a levy paid by social media companies that operate here. We’re pleased to see that has been adopted.
“It is right as well that an independent regulator, established in law, should have the power to oversee the compliance of the social media companies with these requirements. This should include a range of sanctions for failing to meet their duty of care to their users, including large fines, and personal liability for senior directors who are proven to have been negligent in their responsibilities.
“The Select Committee will be holding evidence sessions with Government ministers and regulators to discuss the White Paper during the consultation period. In particular we are keen to understand how quickly they will move to publish draft legislation and introduce it to Parliament. There is an urgent need for this new regulatory body to be established as soon as possible.
“It is also important that the Regulator has the power to initiate its own investigations into the social media companies when it is clear that they have failed to meet their duty of care to their users. This should include the power to discover why effective action was not taken, and who within the company knew what, and when. The Regulator cannot rely on self-reporting by the companies. In a case like the Christchurch terrorist attack, for example, a regulator should have the power to investigate how content of that atrocity was shared and why more was not done to stop it sooner.
“We need a clear definition of how quickly social media companies should be required to take down harmful content, and this should include not only when it is referred to them by users, but also when it is easily within their power to discover this content for themselves. The Regulator should also give guidance on the responsibilities of social media companies to ensure that their algorithms are not consistently directing users to harmful content.
“Disinformation is clearly harmful to democracy and society as a whole. The social media companies must have a responsibility to act against accounts and groups that are consistently and maliciously sharing known sources of disinformation. This is important not only in the fight against the disruption of democracy, but also to combat the spreading of other harmful information, such as anti-vaccination messages, which are endangering the lives of young and vulnerable people. These responsibilities need to be clearly defined.
“The White Paper does not address the concerns raised by the Select Committee about the need for transparency in political advertising and campaigning on social media. We understand that this will soon be addressed separately by the Cabinet Office. Again, it is vital that our electoral law is brought up to date as soon as possible, so that social media users know who is contacting them with political messages and why. Should there be an early election, then emergency legislation should be introduced to achieve this.”