The report on the UK’s draft Online Safety Bill was published on 14th December and contains four key recommendations to further strengthen this long-awaited piece of legislation. The Online Safety Bill has arisen from the failure of online platforms to self-regulate. It represents the first attempt to regulate social media companies, video-sharing sites and search engines under a single piece of legislation.
The Bill is the result of public policy and parliamentary process going back nearly half a decade. The draft Bill was published by the Government on 12 May 2021. It followed the Online Harms White Paper, published in April 2019, which was itself the result of a commitment made in the Internet Safety Strategy Green Paper, published in October 2017.
The Joint Committee on the Draft Online Safety Bill was appointed on 22 July 2021 to consider the draft Bill and recommend improvements to the Government. Over five months it received more than 200 submissions of written evidence and held oral evidence hearings with over 50 witnesses.
Defining the objectives of the Online Safety Bill
Research by the Department for Digital, Culture, Media and Sport (DCMS) has shown that “80 per cent of six to 12 year-olds have experienced some kind of harmful content online”, while half of 13 to 17 year-olds believe that in the last three months they have seen something online that constitutes illegal content. So, it’s reassuring that the Bill aims to apply a child safety-focused framing to content moderation and online activities.
The Committee recommended that the Bill be restructured to set out its core objectives. Chief among these: Ofcom should aim to improve online safety for UK citizens by ensuring that service providers:
- comply with UK law and do not endanger public health or national security;
- provide a higher level of protection for children than for adults;
- identify and mitigate the risk of reasonably foreseeable harm arising from the operation and design of their platforms;
- recognise and respond to the disproportionate level of harms experienced by people on the basis of protected characteristics;
- apply the overarching principle that systems should be safe by design whilst complying with the Bill;
- safeguard freedom of expression and privacy; and
- operate with transparency and accountability in respect of online safety.
There were four key recommendations from the Committee:
#1 What’s illegal offline should be regulated online
The Committee agreed that the criminal law should be the starting point for regulation of potentially harmful online activity, and that ‘safety by design’ is critical to reduce its prevalence and reach.
#2 Ofcom should issue binding Codes of Practice
The report recommended that Ofcom be required to issue a binding Code of Practice to assist online providers in identifying, reporting on and acting on illegal content, in addition to those on terrorism and child sexual exploitation and abuse content.
For those worried about how the Online Safety Bill could potentially infringe on freedom of speech online, the Committee pointed out that, as a public body, Ofcom’s Code of Practice will need to comply with human rights legislation (currently being reviewed by the government) and this will provide an additional safeguard for freedom of expression.
#3 Keep children safe from accessing pornography
The Committee recommends that all pornographic websites be required to prevent children from accessing their content, since such sites present a threat to children both by allowing them access and by hosting illegal videos of extreme content.
#4 New criminal offences are needed for new harmful online activities
The Committee endorsed the Law Commission’s recommendations for the creation of new criminal offences to cover harmful activities that have sprung up online, such as cyberflashing, encouraging someone to commit self-harm, and paid-for adverts for financial scams. It further recommended that the Government bring in the Law Commission’s proposed Communications and Hate Crime offences together with the Online Safety Bill.
The buck stops at the Big Tech boards
The report calls for tech companies to appoint someone at board level designated the company’s “safety controller”, who would be liable for a new criminal offence: failing to deal with “repeated and systemic failings that result in a significant risk of serious harm to users”. Under the Bill, senior managers face a fine or up to two years in jail if they fail to comply with “information requests” from Ofcom.
What happens to the Online Safety Bill now?
The Government now has two months to respond to the Joint Committee’s report, after which the two Houses of Parliament will have the opportunity to debate the Bill before it passes into law, now projected for late 2022.
For more about how the digital world is changing the way we live, learn and love, and what we can do to mitigate some of the harms, pick up a copy of my new book.