
Digital Secretary Oliver Dowden and Home Secretary Priti Patel have announced the government's final decisions on new laws to make the UK a safer place to be online. The measures represent the most comprehensive approach yet to online regulation and are designed to protect children and vulnerable users.
The announcement sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users. The safety of children is at the heart of the measures. Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. The government is also progressing work with the Law Commission on whether the promotion of self-harm should be made illegal.
Tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography. This will help make sure future generations enjoy the full benefits of the internet with better protections in place to reduce the risk of harm.
The most popular social media sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.
Ofcom is now confirmed as the regulator, with the power to fine companies that fail in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher, and to block non-compliant services from being accessed in the UK.
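To illustrate how the two-part fine cap works, here is a minimal sketch in Python. The function name and the example turnover figure are illustrative only; the £18 million floor and the ten per cent rate are taken from the announcement.

def max_fine_gbp(annual_global_turnover_gbp: float) -> float:
    """Upper bound on an Ofcom fine under the proposed regime:
    the higher of a flat £18 million or 10% of annual global turnover."""
    return max(18_000_000.0, 0.10 * annual_global_turnover_gbp)

# Example: for a platform with £5 billion annual global turnover,
# 10% of turnover (£500 million) exceeds the £18 million floor,
# so the maximum fine would be £500 million.
print(f"£{max_fine_gbp(5_000_000_000):,.0f}")  # £500,000,000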
The legislation includes provisions to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously, for example by not responding fully, accurately and in a timely manner to information requests from Ofcom. This power would be introduced by Parliament via secondary legislation; reserving it in this way follows similar approaches in other sectors, such as financial services regulation.
Responding to the announcement, Richard said: "The Online Harms Bill, to be introduced next year, ensures that we enter a new age of accountability for tech, protecting children and vulnerable users, restoring trust in this industry, and enshrining in law safeguards for free speech. Social media sites, websites, apps and other services will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. Tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography.
"As the regulator, Ofcom will have the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. The legislation also includes provisions to impose criminal sanctions on senior managers, meaning that tech companies must put public safety first or face the consequences. This proportionate new framework will ensure we don’t put unnecessary burdens on small businesses but give large digital businesses robust rules to follow."