UK’s Draft Online Safety Bill Raises Serious Concerns Around Freedom of Expression

Draft Online Safety Bill: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/985033/Draft_Online_Safety_Bill_Bookmarked.pdf

On May 12, the UK government published a draft of its Online Safety Bill, which attempts to tackle illegal and otherwise harmful content online by placing a duty of care on online platforms to protect their users from such content. The move came as no surprise: over the past several years, UK government officials have expressed concerns that online services have not been doing enough to tackle illegal content, particularly child sexual abuse material (commonly known as CSAM) and terrorist and violent extremist content (TVEC), as well as content the government has deemed lawful but “harmful.” The new Online Safety Bill also builds upon the government’s earlier proposals to establish a duty of care for online providers, laid out in its April 2019 White Paper and its December 2020 response to a consultation.

EFF and OTI submitted joint comments as part of that consultation on the Online Harms White Paper in July 2019, pushing the government to safeguard free expression as it explored developing new rules for online content. Our views have not changed: while EFF and OTI believe it is critical that companies increase the safety of users on the internet, the recently released draft bill poses serious threats to freedom of expression online and must be revised. In addition, although the draft features some notable transparency provisions, these could be expanded to promote meaningful accountability around how platforms moderate online content.

A Broad and Vague Notion of “Harmful Content”

The bill is broad in scope, covering not only “user-to-user services” (companies that enable users to generate, upload, and share content with other users), but also search engine providers. The new statutory duty of care will be overseen by the UK Office of Communications (OFCOM), which will have the power to levy substantial fines and to block access to sites. Among the core issues that will determine the bill’s impact on freedom of speech is the concept of “harmful content.” The draft bill opts for a broad and vague notion: content is harmful if it could reasonably, from the perspective of the provider, have a “significant adverse physical or psychological impact” on users. The great subjectivity involved in complying with this duty of care risks overbroad removal of speech and inconsistent content moderation.

As for illegal content, the bill’s “illegal content duties” oblige platform operators to minimize the presence of so-called “priority illegal content,” a category to be defined through future regulation, and to take down any illegal content upon becoming aware of it. The draft bill thus departs from the EU’s e-Commerce Directive (and the proposed Digital Services Act), neither of which imposes affirmative removal obligations on platforms. On the question of what constitutes illegal content, platforms are put first in line as arbiters of speech: content is deemed illegal if the service provider has “reasonable grounds” to believe that it constitutes a relevant criminal offence.

The bill also places an undue burden on smaller platforms, raising significant concerns that it could erode competition in the online market. Although the bill distinguishes between large platforms (“Category 1”) and smaller platforms (“Category 2”) when apportioning responsibilities, it does not set out clear criteria for how a platform would be categorized; instead, it leaves that decision to the Secretary of State. Without clear criteria, smaller platforms could be miscategorized and required to meet the bill’s more granular transparency and accountability standards. While all platforms should strive to provide adequate and meaningful transparency to their users, it is also important to recognize that certain accountability processes require significant resources and labor, and a large user base does not necessarily come with correspondingly large resources. Platforms miscategorized as larger platforms may be unable to meet the more stringent requirements or pay the corresponding fines, putting them at a significant disadvantage. The UK government should therefore provide greater clarity around how platforms would be categorized under the bill, so that companies have sufficient notice of their responsibilities.

Lastly, the draft bill contains some notable transparency and accountability provisions. For example, it requires providers to issue annual transparency reports using guidance provided by OFCOM. In addition, the bill seeks to respond to previous concerns around freedom of expression online by requiring platforms to conduct risk assessments around their moderation of illegal content, and it requires OFCOM to issue its own transparency report summarizing insights and best practices garnered from company transparency reports. These are good first steps, especially given that governments are increasingly using legal channels to request that companies remove harmful and illegal content.

However, it is important for the UK government to recognize that a one-size-fits-all approach to transparency reporting does not work, and often prevents companies from highlighting the trends and data points most relevant to their services. In addition, the structure of the OFCOM transparency report suggests that it would mostly summarize insights, rather than provide accountability around how internet platforms and governments work together to moderate content online. Further, the draft bill does little to require features such as user notice and an appeals process for content decisions, despite robust advocacy by content moderation and freedom of expression experts. Adequate notice and appeals are integral to ensuring that companies provide transparency and accountability around their content moderation efforts, and they are key components of the Santa Clara Principles for Transparency and Accountability in Content Moderation, of which EFF and OTI were among the original drafters and endorsers.

The UK Government Should Revise the Draft Bill to Protect Freedom of Speech

As social media platforms continue to play an integral role in information sharing and communications globally, governments around the world are taking steps to push companies to remove illegal and harmful content. The newly released version of the UK government’s Online Safety Bill is the latest example of this, and it could have a significant impact in the UK and beyond. While well-intentioned, the bill raises serious concerns around freedom of expression online, and it could do more to promote responsible and meaningful transparency and accountability. We strongly encourage the UK government to revise the current draft to better protect freedom of speech and to more meaningfully promote transparency.
