Reducing Online Harms through a Differentiated Duty of Care: A Response to the Online Harms White Paper

Author: Damian Tambini
Publication date: Tue, 25 Jun 2019

Responding to the government's policy proposals in its Online Harms White Paper, LSE media expert and government adviser Damian Tambini argues that social media companies have a 'duty of care' to protect users from harms caused by content published on their platforms.

He argues that the government is correct to propose a new institution, Ofweb, with the power to regulate online content in order to combat the significant harms caused by hate speech, foreign interference in democracy, images of self-harm, and terrorist content online. Yet he also warns of the potential dangers of the White Paper's approach, which could inhibit freedom of expression if the harms are not clearly defined.

The policy brief proposes a detailed distinction between content that is harmful but legal and content that is illegal, and recommends that illegal content be met with sanctions, including civil fines.

Tambini tackles the central legal and constitutional problem posed by a new code of conduct for legal harms such as political speech that interferes in the democratic process – so-called ‘fake news’. He finds that such censorship-like functions would not satisfy the European Convention on Human Rights tests for restrictions on free speech: legality (parliamentary oversight), necessity in a democratic society, and proportionality.

Therefore, Parliament must decide whether new offences and categories of content require new laws and liabilities, and must set standards for blocking or filtering the most dangerous content. Given the dynamic nature of online harms, the process for introducing new laws to reflect them should be more efficient and evidence-based, drawing on advice from the new regulator.