How will the Online Safety Bill protect adults and children?

The Online Safety Bill is a new set of laws to protect children and adults online. It will make social media companies more responsible for their users’ safety on their platforms.

How will the Online Safety Bill protect children?

The Bill will make social media companies legally responsible for keeping children and young people safe online.

It will protect children by requiring social media platforms to:

  • remove illegal content quickly or prevent it from appearing in the first place. This includes removing content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content
  • enforce age limits and age-checking measures
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise


How will the Online Safety Bill protect adults?

The Bill will protect adults in three ways through a ‘triple shield’.

Platforms will need to:

  • Remove all illegal content.
  • Remove content that is banned by their own terms and conditions.
  • Empower adult internet users with tools so that they can tailor the type of content they see and can avoid potentially harmful content if they do not want to see it on their feeds. Children will be automatically prevented from seeing this content without having to change any settings.


What types of content will be tackled?

Illegal content

Some content that children and adults encounter online is already illegal. The Bill will force social media platforms to remove all illegal content, stopping children and adults from seeing it.

The Bill also creates new offences, making content that promotes self-harm illegal for the first time. Platforms will need to remove it.

This is not just about removing existing illegal content; it is also about stopping it from appearing at all. Platforms will need to think about how they design their sites to reduce the likelihood of them being used for criminal activity in the first place.


Illegal content that platforms will need to remove includes:

  • child sexual abuse
  • controlling or coercive behaviour
  • extreme sexual violence
  • fraud
  • hate crime
  • inciting violence
  • illegal immigration and people smuggling
  • promoting or facilitating suicide
  • promoting self-harm
  • revenge porn
  • selling illegal drugs or weapons
  • sexual exploitation
  • terrorism


Harmful content

Some content is not illegal but could be harmful or age-inappropriate for children. Platforms will need to prevent children from accessing it.

Harmful content that platforms will need to protect children from includes:

  • pornographic content
  • online abuse, cyberbullying or online harassment
  • content that does not meet a criminal level but which promotes or glorifies suicide, self-harm or eating disorders


Underage children will be kept off social media platforms


Under the online safety laws, social media companies will have to keep underage children off their platforms.

Social media companies set the age limits on their platforms, and many of them say children under 13 years of age are not allowed, yet many younger children still have accounts. The new laws will put a stop to this.

Different technologies can be used to check people’s ages online. These are called age assurance or age verification technologies.

The new laws mean social media companies will have to say what technology they are using, if any, and show they are enforcing their age limits.


Adults will have more control over the content they see

The largest platforms will be required to offer adult users tools so they can have greater control over the kinds of content they see and who they engage with online. This includes giving them the option of filtering out unverified users, which will help stop anonymous trolls from contacting them.

The largest platforms will have to provide adult users with tools to help reduce the likelihood that they will encounter certain types of content that will be set out in the Bill. Examples include content that does not meet a criminal threshold but promotes or encourages eating disorders or self-harm, or is racist, antisemitic or misogynistic.

The Bill will already protect children from seeing this content.

The tools must be effective and easy to access, and could include human moderation, blocking content flagged by other internet users, or sensitivity and warning screens.

The Bill will tackle repeat offenders

These laws will require all social media companies to assess how their platforms could allow abusers to create anonymous profiles, and take steps to ban repeat offenders, preventing them from creating new accounts and limiting what new or suspicious accounts can do.


How will the Online Safety Bill be enforced?

The government is appointing Ofcom as the regulator responsible for checking that platforms are protecting their users.

Platforms will have to show they have processes in place to meet the requirements set out by the Bill. Ofcom will check how effective those processes are at protecting internet users from harm.


Ofcom will have powers to take action against companies that do not comply with their new duties. Companies could be fined up to £18 million or 10 per cent of their annual global turnover, whichever is greater. Criminal action can be taken against senior managers who fail to comply with information requests from Ofcom.

In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers, and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.


How will these UK laws impact international companies?

Ofcom will have the power to take appropriate action against all social media and tech companies, no matter where they are based, if they are accessible to UK users.


What are the next steps for the Online Safety Bill?

The new laws are currently going through the UK Parliament and will come into force once they complete their passage.

Digital Partnerships

At Achieve Foundation, we know from experience that the best way to help digitally excluded people is to provide one-to-one support with trusted digital champions.

A major barrier to digital inclusion is resourcing this support, both in terms of funding and the availability of skilled champions. We believe this can only be solved through partnership and collaboration.

Partnership is an essential foundation for digital inclusion strategies as no single organisation can solve this issue alone.

Therefore, Achieve Foundation continues to work with a wide variety of partners to build local digital inclusion partnerships.

Get in touch to partner with us to end digital inequalities.


This information is licensed under the Open Government Licence v3.0, except where otherwise stated.
