Responding to the government's policy proposals in its White Paper on Online Harms, LSE media expert and government adviser Damian Tambini argues that social media companies have a 'duty of care' to protect users from harms caused by content published on their platforms.
He argues that the government is right to propose a new institution, Ofweb, with the power to regulate online content in order to combat the significant harms caused by hate speech, foreign interference in democracy, images of self-harm, and terrorist content online. Yet he also warns that the White Paper's approach carries risks: it could inhibit freedom of expression if the harms it targets are not clearly defined.
The policy brief proposes a detailed distinction between content that is harmful but legal and content that is illegal, and recommends that illegal content be met with sanctions, including civil fines.
Tambini tackles the central legal and constitutional problem posed by a new code of conduct for legal harms such as political speech that interferes with the democratic process, so-called 'fake news'. He finds that such censorship-like functions would not satisfy the European Convention on Human Rights free speech tests of legality (parliamentary oversight), necessity in a democratic society, and proportionality.
Parliament must therefore decide whether new offences and categories of content require new laws and liabilities, and set standards for blocking or filtering the most dangerous content. Given the dynamic nature of online harms, the process for introducing new laws to address them should be made more efficient and evidence-based, drawing on advice from the new regulator.