FEATURE 

The Online Safety Bill & Specialist Publishers

The Online Safety Bill is a watershed moment in the regulation of tech platforms worldwide, bringing accountability to their actions. The success or failure of the legislation could have ramifications far beyond the UK, writes the PPA’s Seb Cuttill.

By Seb Cuttill


The Bill places a "duty of care" on platforms to remove illegal content, content that is harmful to children, and content that is legal but harmful to adults. A parliamentary Joint Committee has scrutinised the draft Bill, with one of the questions being how to ensure that strong protections are put in place to preserve freedom of expression, including content produced by publishers.

The Government had previously promised that: “Content published by a news publisher on its own site […] will not be in scope of the regulatory framework and user comments on that content will be exempted”, further pledging that “legislation will include robust protections for journalistic content shared on in-scope services”.

However, the draft Bill narrowly defines the “news-related material” that “recognised news publishers” must have the “principal purpose” of producing, focusing on news or information about “current affairs”. This would mean that content created by many IPSO-regulated specialist publications in the UK would fall in scope of the Bill, despite already being subject to a robust regulatory regime.

There is a real danger that, when faced with high financial penalties or potential criminal sanctions for non-compliance, large tech platforms will implement indiscriminate algorithms to avoid sanctions. An unintended consequence of this move is that such algorithms may fail to distinguish between professionally produced specialist content focusing on a sensitive topic and malicious content. In its Report to Government, the Joint Committee referenced PPA’s concerns around the definitions and recommended that Government “consult with the relevant industry bodies to see how the exemption might be amended to cover this off, without creating loopholes in the legislation”.

There is a real danger that ... large tech platforms will implement indiscriminate algorithms to avoid sanctions.

Insufficient protection

Whilst broadening the protections in the Bill to encompass specialist publishers would be a major step forward, the protections themselves are, at present, insufficient to guarantee a watertight exemption. The Joint Committee agreed with this assessment, recommending that news publisher content should not be moderated, restricted or removed unless the publication of such content is a clear criminal offence. This will be critical in guaranteeing that the public can access professionally researched content on a range of issues. Preventing the spread of harmful content is only the first stage in boosting media literacy: there must also be a diverse range of reliable content readily available.

A further concern is that the exemption for publisher content does not encompass comments on publishers’ websites. Below-the-line comments are in theory outside the scope of the Bill via a general ‘limited functionality’ exemption, yet the inclusion of features with greater functionality, such as online forums, would bring a publisher’s website into scope of the Bill. It should be noted that IPSO already has regulatory oversight of moderated comments sections.

Features such as online forums often hold a great deal of value for readers, yet are rarely a large driver of revenue, meaning publishers may be compelled to remove them if the Bill is not amended to specifically exempt publishers’ websites. Perhaps even more concerning is that the “limited functionality” exemption can be repealed by the Secretary of State. Only an express exemption will prevent undue burdens being placed on publishers.

Even more concerning is that the “limited functionality” exemption can be repealed by the Secretary of State.

Regulating harmful advertising

Beyond questions around publisher content, another important development is the Joint Committee’s assertion that paid-for advertising should be in scope of the Bill, with Ofcom responsible for acting against service providers who consistently allow paid-for advertisements that create a risk of harm to be placed on their platform. However, the Joint Committee states that the Advertising Standards Authority should remain responsible for regulating the everyday content of adverts and the actions of advertisers.

Going forward, it will be fascinating to see how this recommendation interacts with the Government’s forthcoming consultation on its Online Advertising Programme (OAP). The OAP will focus on the harms associated with online advertising on platforms, looking at the regulatory model, responsible regulators, and their powers and funding.

As the Report acknowledges, the largest platforms have a commercial model based on selling and targeting advertising, with an incentive to maximise engagement and user attention to create more revenue. The online harms targeted by the Bill are a symptom of this commercial model, and it is these symptoms, rather than the root causes of the platforms’ market power, that the Online Safety Bill will address.

To target this market power, the Government has created the Digital Markets Unit (DMU) to oversee a new regulatory regime for the most powerful digital firms, promoting greater competition and innovation. A Digital Competition Bill, expected in 2022, will give the DMU the power to write tailored Codes of Conduct for firms designated as having Strategic Market Status, and also make Pro-Competitive Interventions to address the root causes of market power.

It is hoped the new regime will rebalance the digital advertising market, with a positive externality being increased revenues for publishers. In relation to the Online Safety Bill, such a rebalancing may also reduce incentives for the largest service providers to amplify harmful content, with users given greater choice of platforms.

The online harms targeted by the Bill are a symptom of this commercial model.

Protecting specialist publishers

When designing and implementing these new approaches to digital regulation, it is critical that the Government recognises the need to both protect and promote specialist publishers. Consumer magazine and business media publishers are an indispensable element of the UK’s media ecosystem, scrutinising issues that larger yet broader publishers would likely not cover. This is facilitated by the deep expertise of editorial teams, with Ofcom’s own research showing that magazines are more trusted than any other news medium in the UK.

Having the value of specialist publishers’ content recognised in the Online Safety Bill would be a key step, with the legislation expected to be passed in late 2022 or early 2023. Yet the path ahead remains fraught with challenges. This is inevitable, given the mammoth task of making regulation of the online sphere fit for the modern day. At each stage, it is imperative that 21st century publishers are given the space to innovate and grow in the new online environment. A universal appreciation of the immense value of specialist content will be central in realising this goal.

It is critical that the Government recognises the need to both protect and promote specialist publishers.