After a long period of waiting, debating and re-drafting, the Online Safety Act 2023 has finally made its way into UK law. For those businesses caught by its scope, it’s a big set of responsibilities and changes.
Rather than debate the merits of the Act and whether the obligations and duties it imposes are proportionate to its overall aim, the time has come for businesses to understand, in practical and actionable terms, what they need to do to ensure that they don’t fall foul of the new legislation.
1. Does it apply to you?
The first thing to determine is whether you offer a service which falls within the scope of the OSA. The OSA covers two types of services:
- user-to-user services – i.e. an internet service where content is generated directly by users, and/or where content is uploaded to or shared on the service by users, with that content then becoming available to other users of the service (e.g. social media platforms, forums, wiki pages, and all other forms of network based or ‘collaborative’ content platforms).
- search services – i.e. an internet service that allows users to search more than one website or database for particular content (e.g. a search engine, be that a general tool such as Google or Bing, or a more specialised service that aggregates content from a smaller range of sites for a more focused type of search).
Before breathing a sigh of relief too quickly, recognise that these are broad definitions, and capture commonplace features such as chat rooms or bulletin boards that may feature on sites/platforms where that functionality isn’t the core service. So have a look through your website and determine whether you do, in fact, have any such features. If yes, read on.
If you do find that your business – or a particular bit of functionality offered on your website/platform – is caught by the scope of the OSA, it doesn’t mean that it is time to panic just yet. The OSA is designed to impose the most significant obligations on the companies with the most significant reach. We await secondary legislation to specify exactly which businesses fall within Category 1, but for now we know that Category 1 is reserved for the highest-risk, highest-reach user-to-user platforms. SMB predicts that this will be companies such as Meta, TikTok and X (that prediction does not require a crystal ball). Category 2A will cover the companies with the highest-reach search services (Google and Bing, etc.) and Category 2B will cover other platforms with high-risk functionality (i.e. all of the other user-to-user services).
What that means is that most companies which offer functionality falling into the categories described above will fall into Category 2B. While these companies are within the scope of the OSA, they will only be obliged to comply with its lower tier of obligations.
2. Appoint a responsible person.
Appoint a member of staff to be responsible for understanding the legislation and for driving any changes required to ensure the business is compliant. The responsible person’s duties will include facilitating training of other staff members, carrying out risk assessments with input from any other relevant members of staff (see point 3 below), monitoring the guidance and secondary legislation which continues to be published and, depending on the size and risk profile of your business, reviewing content and/or managing the complaints procedures (see points 4 and 5 below). This should be a senior member of staff, and the role should be given considerable weight and importance – responsible members of staff can be found personally liable if their business is found to be in breach of the OSA.
3. Carry out a Risk Assessment.
Once you’ve determined that your business is in scope and you have appointed someone to be responsible for compliance, this individual should take proactive steps to understand your company’s risk profile. The OSA sets out 15 categories of harm that it seeks to prevent, so you should consider whether your service is likely to facilitate the creation or sharing of any content falling within those 15 categories. You should also consider who your user base is, and whether it is likely to include children (for the purposes of the OSA, anyone under the age of 18) or vulnerable adults. If so, your risk profile will be increased.
Its guidance remains a work in progress, but Ofcom has published initial guidance on how to carry out these risk assessments, which sets out four steps:
- Understand the harms.
- Assess the risk of harm.
- Decide measures, implement and record.
- Report, review and update risk assessments.
The process is analogous to carrying out a data protection impact assessment, as required under the UK GDPR. Businesses should, in the first instance, review the content and functionality currently on their service, and set up a system for revisiting that assessment at a frequency which is proportionate to the risk of harm identified.
For further information, please see: https://www.ofcom.org.uk/online-safety/information-for-industry/guide-for-services/risk-assessments
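The four steps above lend themselves to a simple written record. The sketch below is purely illustrative – the harm labels, risk levels and review intervals are assumptions for demonstration, not terms defined by the OSA or by Ofcom’s guidance – but it shows the shape of a record that captures each step and schedules the proportionate review the guidance calls for.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Illustrative only: these risk levels and review intervals are assumptions,
# not figures taken from the OSA or Ofcom guidance.
REVIEW_INTERVALS = {
    "low": timedelta(days=365),
    "medium": timedelta(days=180),
    "high": timedelta(days=90),   # higher risk -> more frequent review
}

@dataclass
class RiskAssessment:
    service_name: str
    harms_considered: list                         # step 1: understand the harms
    risk_level: str                                # step 2: assess the risk ("low"/"medium"/"high")
    measures: list = field(default_factory=list)   # step 3: decide measures, implement and record
    assessed_on: date = field(default_factory=date.today)

    def next_review_due(self) -> date:
        # step 4: review and update at a frequency proportionate to the risk
        return self.assessed_on + REVIEW_INTERVALS[self.risk_level]

assessment = RiskAssessment(
    service_name="example-forum",
    harms_considered=["illegal content", "content harmful to children"],
    risk_level="high",
    measures=["user reporting button", "moderation queue"],
    assessed_on=date(2024, 3, 1),
)
print(assessment.next_review_due())  # 2024-05-30
```

The point of the record is less the code than the discipline: each assessment states what was considered, what was decided, and when it must next be revisited.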
4. Establish a mechanism and procedure for reporting and removing content.
Given that you have a legal duty to take down harmful content, it is critical that you have a procedure for identifying it. This should comprise both an internal mechanism for regular reviews of the content available on your site (carried out by software, or by giving someone within your organisation the duty of conducting regular reviews of the website) and a public-facing mechanism which allows visitors to your website to quickly and easily flag content which they believe to be harmful.
However, keep in mind that reported content should not automatically be removed. Users of your service have a right to bring a claim against you if you remove their content, or ban them from your service, for allegedly sharing harmful content where it transpires that there was, in fact, no breach. This right exists to deter companies from adopting a procedure under which any reported content is immediately removed without consideration.
That sanction is designed to protect the fundamental principle of freedom of expression (against the background of an Act which is otherwise largely concerned with policing and regulating speech) and creates an undeniably awkward tension for affected businesses. In practice, the tension between the two duties means that responsibility for removing content cannot be delegated entirely to software, nor discharged by a blanket approach of always removing or never removing reported content. To satisfy both duties a business will most likely need human input, and where content is valuable (or the business’s reputation in relation to content is valuable) someone familiar with the Act should have final responsibility for pieces of content which are nuanced or sensitive.
Depending on the risk profile you identified for your business when carrying out a Risk Assessment, you may decide that the responsible person can cover this role alone, or you may need to consider whether this is a function which is likely to be in high demand and will require a dedicated individual or team.
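The flow described above – a public flagging mechanism feeding a queue where a human, not software alone, makes the removal decision – can be sketched as follows. The class and status names are illustrative assumptions, not anything prescribed by the OSA.

```python
from dataclasses import dataclass

# Minimal sketch of a report-triage flow. The statuses, and the rule that
# nothing is removed without a human decision, are illustrative assumptions.

@dataclass
class Report:
    content_id: str
    reason: str
    status: str = "pending"   # pending -> removed or retained after human review

class ReviewQueue:
    def __init__(self):
        self.reports = []

    def flag(self, content_id: str, reason: str) -> Report:
        # Public-facing mechanism: any visitor can flag content...
        report = Report(content_id, reason)
        self.reports.append(report)
        return report

    def review(self, report: Report, is_harmful: bool) -> str:
        # ...but removal only follows a considered human decision, so content
        # which turns out not to be harmful is not taken down automatically.
        report.status = "removed" if is_harmful else "retained"
        return report.status

queue = ReviewQueue()
r = queue.flag("post-42", "suspected harmful content")
print(r.status)                # "pending" - flagging alone removes nothing
print(queue.review(r, False))  # "retained" - reviewer found no actual breach
```

The design choice worth noting is that `flag()` never changes what is published; only `review()` does, which keeps the freedom-of-expression duty and the takedown duty in one auditable place.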
5. Establish a complaints procedure.
You will also need to establish a mechanism by which users can complain that the business:
- has not removed unlawful content;
- has not responded to reported content;
- has breached an individual’s freedom of expression or privacy by removing their content unfairly; or
- has taken steps which unfairly mean that content relating to that person no longer appears in search results, or is given a lower priority in search results.
Technically, this should be straightforward: ask your web developers to add the functionality to the website. Behind the scenes is where the difficulties lie. As set out at point 4 above, the extent to which you will need a dedicated team to handle these complaints, or whether your responsible person can absorb this role, depends entirely on your risk profile and the volume of content which you process. Businesses should establish a procedure based on the risk profile identified, and be prepared to scale up this role in the event that more complaints are received than expected.
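As a minimal sketch of the intake side, the grounds of complaint listed above can be captured as fixed categories so that each complaint is recorded against one of them. The identifiers and field names here are assumptions for illustration only.

```python
# Illustrative complaint intake. The ground identifiers below mirror the
# grounds of complaint described in the text; the names are assumptions.
COMPLAINT_GROUNDS = {
    "not_removed": "unlawful content was not removed",
    "no_response": "a report of content received no response",
    "unfair_removal": "content was removed unfairly",
    "search_demotion": "content was unfairly excluded from or demoted in search results",
}

def open_complaint(user_id: str, ground: str) -> dict:
    # Reject anything outside the recognised grounds so every complaint
    # lands in a category the procedure knows how to handle.
    if ground not in COMPLAINT_GROUNDS:
        raise ValueError(f"unknown complaint ground: {ground}")
    return {
        "user": user_id,
        "ground": ground,
        "description": COMPLAINT_GROUNDS[ground],
        "status": "open",
    }

complaint = open_complaint("user-7", "unfair_removal")
print(complaint["status"])  # "open"
```

Categorising complaints at intake also gives the responsible person the volume-by-ground figures needed to judge when the role must be scaled up.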
6. Update your website or user terms and conditions.
Terms and conditions should be updated so that users of your service are aware that you have identified that there is a risk of harm for users who use your services, and that you will take proactive steps to remove any content which is harmful. It should be expressly stated within your terms and conditions that where users share content, it must not be harmful.
As set out in point 4, you should also highlight that users have a right to bring a claim against you if their content is removed where it was not, in fact, harmful.
You may choose to incorporate your complaints policy (see point 5) into your terms, or simply refer to it there if you want to keep the complaints policy as a standalone document.
7. Be prepared to re-think.
The new legislation has lurched into reality, but we are still in the dark about large swathes of it and how it will work in practice. Much of the Act’s fine detail, and regulatory expectation, is due to be filled in by Ofcom as it issues guidance over the course of this year. Businesses will need to take proactive steps to comply, but will also need to be prepared to revisit these conversations until a routine has been established which embeds the duties set out in the new legislation into the day-to-day running of the business without becoming commercially unviable.