Online Safety Bill - February 2023

Dear Constituent,

Thank you for contacting me about the Online Safety Bill.

I welcome the Government's plans to make the UK the safest place to go online. The Online Safety Bill will give adults control over what they see and engage with online. It will also ensure that children are protected, by allowing parents to see and act on the dangers sites pose to young people.

On 5 December 2022, MPs agreed to re-commit certain Clauses and Schedules of the Online Safety Bill to a Public Bill Committee. The Committee met in mid-December 2022, and the Bill completed its remaining stages in the House of Commons on 17 January 2023; it will now progress to the House of Lords. Throughout this period, constituents have contacted me with concerns about various aspects of the Bill, and I have set out my thoughts on each concern below:

Misinformation and Criminal Offences

While the Government has changed some measures in the Bill, I am reassured that it has confirmed that the new offences of false and threatening communications will remain. The false communications offence will protect individuals from communications the sender knew to be false and intended to cause harm, while the threatening communications offence will capture communications which convey a threat of serious harm, such as grievous bodily harm or rape.

Although there is an existing offence in the Communications Act that captures knowingly false communications, the new offences will raise the current threshold of criminality. As the internet becomes an increasingly integral part of our daily lives, I believe it is imperative that action is taken to ensure users are protected online, and I therefore welcome these new measures.

Alongside the Online Safety Bill, which will require in-scope companies to remove and limit the spread of illegal content, the Beating Crime Plan sees ministers working across Government to improve cyber resilience and ensure we stay ahead of cyber criminals. The Government has invested £195 million over the last five years to establish a specialist cyber law enforcement network to disrupt and prosecute cyber criminals and to support victims in response and recovery. I am also encouraged that the new National Cyber Strategy strengthens the law enforcement response to cybercrime and drives greater collaboration with the National Cyber Security Centre and the National Cyber Force (see the National Cyber Strategy 2022 on GOV.UK: www.gov.uk).

Protections for Journalists and Journalism

When introduced to Parliament for its Second Reading, the Online Safety Bill would not have stopped platforms from removing news publishers’ content, or making it less visible, while they reviewed it for potential breaches of their terms and conditions, even if they eventually found no fault with it. News content has been removed or made less visible by social media moderators or algorithms for unclear reasons, often at the height of news cycles: in January 2021, for example, YouTube suddenly removed TalkRadio’s channel, only to reinstate it 12 hours later and admit the move had been a mistake. The Government has therefore amended the Bill to guard against the arbitrary removal of articles from journalists at recognised news outlets when shared on social media platforms.

This amendment will add an extra layer of protection to the safeguards for online journalism already written into the Bill. Category 1 companies, which include the largest and most popular social media platforms, will be required to ensure that recognised news publishers’ articles remain viewable and accessible on their sites even while under review by moderators. They will also be required to notify news publishers, and offer them a right of appeal, before removing or moderating their content or taking any action against their accounts.

Legal but Harmful Content

The Online Safety Bill has three main aims: strengthen the protections for children; ensure that adults' right to legal free speech is protected; and create a system of transparency, accountability, and control to give the British public more choice and power over their own accounts and experience. In its earlier form, the Bill would simply have granted too much control over the content we consume online to anonymous committees in Silicon Valley, rather than empowering individuals to make their own decisions.

The Bill’s key objective, above everything else, is the safety of young people online. To this end, Ministers have made a series of amendments that will further enhance the existing protections for children contained within the legislation. These include requiring platforms to explain clearly in their terms of service the measures they use to enforce minimum age requirements where one is specified, and to publish summaries of their risk assessments for illegal content and material that is harmful to children. The Children's Commissioner will also be named as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that the measures relating to children are robust and reflect the concerns of parents.

Many constituents have been particularly concerned about provisions that would have resulted in the over-removal of legitimate legal content by creating a new category of ‘legal but harmful’ speech. However admirable the goal, I do not believe it is morally right to censor speech online that is legal to say in person. Ministers have therefore quite rightly removed the 'legal but harmful' provisions from the Bill in relation to adults and replaced them with a fairer, simpler, and more effective mechanism called the Triple Shield, which focuses on user choice, consumer rights, and accountability while protecting freedom of expression.

Under the Triple Shield, three important rules apply: content that is illegal should be removed; legal content that a platform prohibits in its own terms of service should be removed, while legal content that a platform allows in its terms of service should not be; and adults should be empowered to choose whether or not to engage with legal forms of abuse and hatred where the platform they are using allows such content. This third protection places a duty on platforms to give their users the functionality to control their exposure to unsolicited content. Crucially, these functions will under no circumstances limit discussion, robust debate, or the ability of support groups to speak about issues freely.

These changes will ensure the Bill protects free speech whilst holding social media companies to account for their promises to users, guaranteeing that users will be able to make informed choices about the services they use and the interactions they have on those sites.

Private Communications

It is important that the Bill defends freedom of expression. As noted above, the Government has quite rightly removed the 'legal but harmful' provisions, which could have inadvertently stifled free speech, and replaced them with new duties that strengthen the requirements on major platforms to adhere to their own terms and conditions on content moderation. Platforms will also be required to articulate clearly in their terms of service what they are doing to enforce age requirements on their sites, such as the use of age verification technology.

The Bill will also introduce several new offences, including cyberflashing, intimate image abuse, encouragement of self-harm, false communications and threatening communications.

As part of the new duties to protect adults, in-scope companies will need to take preventative measures to tackle illegal content or activity. Platforms will also be required to have appropriate systems and processes in place to remove content banned by their own terms and conditions. These duties will mean that users can make informed decisions about which sites they use and know what to expect online. In addition, platforms will need to give users tools to reduce the likelihood of seeing particular categories of content they wish to avoid, such as content promoting suicide, self-harm, or eating disorders, or content that is abusive or incites hate.

Ofcom will also be given the power to make platforms publish details of any enforcement notices they receive from the regulator for breaching their safety duties under the Bill.

The UK supports the responsible use of strong encryption, as the safety and security of digital technologies is essential. However, if end-to-end encryption is implemented in a way which intentionally blinds tech companies to content on their platforms, it can have a disastrous impact on public safety. All parts of regulated platforms, including instant messaging services and closed social media groups, are in scope of the Online Safety Bill. Companies cannot use encryption as an excuse to avoid protecting their users, particularly children.

Protections for Women and Girls

Online abuse directed towards women and girls is entirely unacceptable. No one should have to worry about going online for fear of ill-treatment or harm. I therefore welcome that the Online Safety Bill puts in place the regulatory framework to tackle online abuse and protect vulnerable individuals.

Amendments due to be tabled by the Government will list controlling or coercive behaviour as a priority offence, meaning companies will have to take proactive measures against such content, and will name the Victims' Commissioner and the Domestic Abuse Commissioner as statutory consultees, ensuring Ofcom consults them before drafting or amending its codes of practice.

Separately, the Government is bringing forward reforms to the criminal law on the abuse of intimate images. Building on the recommendations of the Law Commission, the Online Safety Bill will criminalise the sharing of people's intimate images without their consent. This, combined with the measures already in the Bill to make cyberflashing a criminal offence, will significantly strengthen protections for women in particular, as they are disproportionately affected by these offences.

Online Sexual Abuse of Children

I welcome the fact that the Government has tabled an amendment to the Bill which will give Ofcom extra tools to ensure technology companies take action to prevent, identify and remove harmful child sexual exploitation and abuse (CSEA) content. As a result, Ofcom will be able to require technology companies, such as social media platforms, to roll out or develop new technologies to better detect and tackle such content on their platforms.

The amendment will support innovation and the development of safety technologies across the technology industry and will incentivise companies to build effective and proportionate solutions to tackle CSEA.

In conjunction with the legislation, I welcome efforts to improve the technology used to tackle online child sexual abuse. For example, GCHQ is collaborating with the tech industry to identify and develop solutions to crack down on large-scale online child sexual abuse, and the government-funded Safety Tech Challenge Fund is demonstrating that it is possible to detect child sexual abuse material in end-to-end encrypted environments while respecting user privacy.

Pornography

The Online Safety Bill will ensure that providers which publish or place pornographic content on their services have a legal duty to prevent children from accessing that harmful content. To this end, robust checks, including age verification technologies, must be implemented to ensure users are aged 18 or over. If sites fail to act, the independent regulator, Ofcom, will be able to issue fines of up to £18 million or 10 per cent of qualifying worldwide revenue, whichever is greater, or block them from being accessible in the UK.

All websites that display pornography fall within the scope of the Government’s pioneering new internet safety laws, capturing commercial providers of pornography as well as sites that allow user-generated content. These measures offer greater protections for children than the narrower scope of the Digital Economy Act would have provided, and they extend to social media companies too, where a considerable quantity of pornographic material is accessible.

The legislation also contains provisions that require companies to report child sexual exploitation and abuse content identified on their services. This will ensure companies provide law enforcement with the high-quality information they need to safeguard victims and investigate offenders.

More broadly, companies will need to consider the risks their sites may pose to the youngest and most vulnerable people and act to protect children from inappropriate content and harmful activity.

Thank you again for taking the time to contact me.

Sincerely,

Richard