Be it Covid-19 conspiracy theories shared in WhatsApp groups, campaigns of harassment by Twitter trolls, or the proliferation of far-right propaganda on YouTube, there is no doubt that harms perpetrated by extremists within the online world remain a pressing issue.
While the draft bill includes much that we welcome, we are increasingly concerned that this crucial bill is being dragged into the ongoing culture war in a way that could dangerously undermine its effectiveness. Vague and badly defined additions about the protection of ‘democratic speech’, and comments by Oliver Dowden, Secretary of State for Digital, Culture, Media and Sport, about defences against so-called “woke campaigners”, have resulted in a draft bill that could, at best, fail to fix the issues it sets out to address and, at worst, actually open up a path for the re-platforming of hateful far-right extremists.
There are many great organisations scrutinising this bill from a variety of angles we care about. You should read the insights of Glitch, the Antisemitism Policy Trust and Demos, amongst many others, as the expertise of civil society organisations will be central to ensuring this legislation is shaped in a way that is effective and listens to the needs of affected communities. We will of course support their demands where we can, and we intend to continue collaborating with a wide variety of organisations going forward.
However, our focus is on the online harm done by the far right, and that is where this paper delves. With this in mind, we have gone through the draft bill, highlighted some of the potential gaps, problems and concerns, and produced questions which need to be answered by the government. With the bill soon to face pre-legislative scrutiny via a committee, we hope these questions will prove to be a constructive addition to the ongoing debate.
What is “democratically important” content?
Section 13 of the Draft Online Safety Bill outlines “Duties to protect content of democratic importance”:
(6) For the purposes of this section content is “content of democratic importance”, in relation to a user-to-user service, if—
(a) the content is—
(i) news publisher content in relation to that service, or
(ii) regulated content in relation to that service; and
(b) the content is or appears to be specifically intended to contribute to democratic political debate in the United Kingdom or a part or area of the United Kingdom.
The press release that accompanied the publication of the draft bill clarified that this will include: “content promoting or opposing government policy or a political party ahead of a vote in Parliament, election or referendum, or campaigning on a live political issue.”
While the aim of protecting speech of democratic importance is an admirable one, there is, at present, a lack of clarity around definitions, which leaves this clause open to abuse by the far right.
Questions:
What is “Journalistic Content”?
Section 14 of the draft bill outlines “Duties to protect journalistic content” which includes “a dedicated and expedited complaints procedure available to a person who considers the content to be journalistic content.”
(8) For the purposes of this section content is “journalistic content”, in relation to a user-to-user service, if—
(a) the content is—
(i) news publisher content in relation to that service, or
(ii) regulated content in relation to that service;
(b) the content is generated for the purposes of journalism; and
(c) the content is UK-linked.
In short, it seems that journalistic content is simply defined as content “generated for the purposes of journalism”.
The press release that accompanied the draft bill stated that “Articles by recognised news publishers shared on in-scope services will be exempted” and that:
This means they [Category 1 companies] will have to consider the importance of journalism when undertaking content moderation, have a fast-track appeals process for journalists’ removed content, and will be held to account by Ofcom for the arbitrary removal of journalistic content. Citizen journalists’ content will have the same protections as professional journalists’ content.
Questions:
Will this bill adequately deal with small but extreme platforms?
The draft bill states that, as soon as reasonably practicable, Ofcom must “establish a register of particular categories of regulated services”, splitting them into Categories 1, 2A and 2B, each with threshold conditions.
This is extremely important as, according to the Government’s response to the White Paper, only Category 1 services “will additionally be required to take action in respect of content or activity on their services which is legal but harmful to adults.” The aim of this is to “mitigate the risk of disproportionate burdens on small businesses.”
Questions:
In November, we laid out the principles that HOPE not hate believes should underpin the Online Harms legislation. You can read our full briefing here.