Social media in modern society

Regulation of Digital Media Platforms: The case of China

In this policy brief, Dr Jufang Wang reviews China's regulation of digital media platforms against the backdrop of the party-state's concerns that the platforms' increasing power as gatekeepers of online news and information may undermine its information control.

Dr Wang examines the adjustment of China’s online content regulatory framework from targeting individual speakers and publishers to targeting digital platforms, which are required to fulfil a series of content governance obligations, including enforcing real-name registration and verification, conducting real-time content monitoring, and establishing a user blacklisting mechanism.

Her analysis covers the rationale behind the severe punishments handed down to private platforms for hosting problematic content deemed politically sensitive or vulgar, or judged to be disinformation, as well as the CCP’s efforts to ensure greater control over the editorial decision-making of these digital media companies at boardroom level.

She considers whether the so-called ‘special management share’ initiative may be extended to larger platforms such as WeChat and Weibo, and discusses the implications for their global reputation, as well as possible conflicts arising from their obligations under international jurisdictions as companies publicly listed on overseas stock markets.

Dr Jufang Wang is deputy director of the Platforms, Governance, and Global Society (PGG) programme in the Law, Justice and Society Research Cluster at Wolfson College, University of Oxford. Previously, Jufang was Vice Director of News at CRI Online (part of China Media Group, China’s equivalent of the BBC), and has been an academic visitor at the BBC (2011) and at the Centre for Socio-Legal Studies, University of Oxford (2014). 
 

Governance of Public Opinion in the Age of Platforms: A Study of China

Jufang Wang, a former news editor in China and academic visitor at the BBC and the University of Oxford, offers insights into China’s news transformation and Internet governance in the platform age. She argues that the Chinese state has adjusted its Internet regulatory framework to target major digital media platforms such as WeChat, Weibo and Toutiao, requiring them to take the “main responsibilities” for governing their sites. This new approach leads to what she calls “state governance through platforms”.

Reducing Online Harms through a Differentiated Duty of Care: A Response to the Online Harms White Paper

In response to the government’s policy proposals in its White Paper on Online Harms, LSE media expert and Government adviser Damian Tambini argues that social media companies have a ‘duty of care’ to protect users from harms caused by content published on their platforms.

He argues that the government is correct to propose a new institution, Ofweb, with the power to regulate online content in order to combat the significant harms caused by hate speech, foreign interference in democracy, images of self-harm, and terrorist content online. Yet he also warns of the potential dangers in the approach of the White Paper, which could inhibit freedom of expression if the harms are not clearly defined.

The policy brief proposes a detailed distinction between harmful but legal content and illegal content, recommending that illegal content be met with sanctions, including civil fines.

Tambini tackles the central legal and constitutional problem posed by a new code of conduct for legal harms such as political speech that interferes in the democratic process – so-called ‘fake news’. He finds that such censorship-like functions would not satisfy the European Convention on Human Rights free speech tests of proportionality, legality (parliamentary oversight), and necessity in a democratic society.

Therefore, Parliament must decide if new offences and categories of content require new laws and liabilities and set standards for blocking or filtering the most dangerous content. Given the dynamic nature of online harms, the process for introducing new laws to reflect harms should be more efficient and evidence-based, with advice from the new regulator. 

Journalists, academics, and government advisers debate how best to regulate online speech

How do we prevent harms caused by online content, and should the state, or internet giants like Facebook and Twitter, be primarily responsible for regulating content published on their platforms?

This was the central question addressed by the Foundation for Law, Justice and Society at a workshop at Wolfson College last week to assess in what ways online speech is and should be governed in China and the West.

Free Speech: Ten Principles for a Connected World

In this keynote lecture, leading political writer Timothy Garton Ash presents his ten guiding principles for a connected world, offering a manifesto for global free speech in the digital age.

Drawing on a lifetime of writing about dictatorships and dissidents, Oxford Professor of European Studies Timothy Garton Ash argues that we are currently experiencing an unprecedented era in human history for freedom of expression.