This page provides guidance for sites and platforms on how to protect the wellbeing of their users. It is essential that all companies recognise self-harm and suicide as serious harms and ensure that policies are in place to protect users.
Establishing clear accountability and responsibility
Companies should ensure that accountability for all policies relating to the protection and safety of users is in place at a senior level.
Clear roles and responsibilities should be assigned to individuals or teams to ensure that policies are well developed, implemented and reviewed. For larger companies, these may be dedicated data protection, safeguarding or policy teams. For smaller sites and platforms, these responsibilities may sit with a single role, such as the platform manager.
Understanding responsibilities to protect users
In order to protect online communities and safely manage self-harm and suicide content, companies must understand and adhere to:
- Data protection and electronic communication laws
- Their safeguarding responsibilities
- Reporting responsibilities
- Responsibilities to protect their users from online harms, such as the upcoming UK online harms regulatory framework
Understanding of these areas should be used to inform a clear and actionable policy for responding to self-harm and suicide content. See our information page: Developing self-harm and suicide content policies for more information.
Information for companies
Understanding data protection laws
All companies must provide accessible information for users about how they use and collect data. This is particularly important if it becomes necessary to share data with emergency services if a user is at imminent risk of harm, or to report illegal content hosted on the platform to law enforcement agencies.
Understanding safeguarding responsibilities
Companies must have a good understanding of safeguarding and how it applies to their site or platform, including when it is necessary to share information with emergency services, safeguarding bodies or law enforcement agencies. Having clear and robust procedures for managing safeguarding concerns will ensure that users are appropriately protected and that any safeguarding issues that do arise are managed promptly and effectively.
Users who may require additional safeguarding include:
- Children and young people under the age of 18.
- Those with special educational needs and disabilities.
- Those with physical or mental health problems.
Sites and platforms aimed at individuals within these groups will require additional safeguarding measures to ensure their safety.
Maintaining a record of safeguarding issues and how they are dealt with allows for monitoring of trends and recurring risks, making it easier to proactively address them.
Companies must understand their responsibilities to report content or safeguarding concerns, such as illegal self-harm and suicide content posted on their site, to law enforcement agencies and safeguarding bodies.
Larger sites and platforms should work to build a two-way relationship with law enforcement and regulatory bodies by establishing clear channels for making requests, including in an emergency. This relationship could take the form of an allocated point of contact, or of proactively sharing information.
Download our information sheet on establishing accountability for the management of self-harm and suicide content.