This week the Government published its long-awaited White Paper on “Online Harms” which contains proposals intended to reduce harmful content on the internet.
The key proposals include the establishment of a new statutory duty of care legally obliging tech firms to take steps to protect their users. Compliance with this duty will be overseen by a new independent regulator and will be set out in new codes of practice. Tech companies that do not fulfil their duty of care could be fined, their senior managers held criminally liable or their websites blocked entirely.
These measures are intended to tackle online material related to self-harm and suicide as well as terrorism, child abuse, revenge pornography, harassment and hate crimes. They also cover harmful behaviour such as cyber-bullying, trolling and the spread of fake news and disinformation.
The new regulatory framework will apply to social media platforms, messaging services, search engines, public discussion forums and file hosting sites.
Launching the White Paper on Online Harms, the Home Secretary warned tech companies: “For too long they have failed to go far enough and fast enough to help keep our children safe. They have failed to do the right thing – for their users, for our families, and for the whole of society … I’m giving tech companies a message they cannot ignore… It’s time for you to protect the users and give them the protection they deserve, and I will accept nothing else”.
The Government are consulting on various aspects of their plans for regulation including the options for appointing an independent regulatory body and its enforcement powers. The public consultation will be open from 8 April to 1 July 2019.
Background
The White Paper has been published following growing concerns over the damaging content that is available online.
Ofcom, the communications sector regulator, found last autumn that 45% of internet users suffer harm from content that they view online. From trolling and bullying to theft of personal data, unwelcome attention via social media, hate speech, harassment and illegal sexual content, the dangers of the internet can be far-reaching.
Of particular concern is the impact of social media use on young people, including their ability to access with apparent ease images that appear to promote and glorify suicide and self-harm. Following the tragic death of a British teenager, and the revelation from other parents that social media played a role in their children’s suicides, there has been increasing momentum amongst the public and politicians to hold social media companies to account for the harmful content that they host.
The current position
Currently, most online and social media sites are unregulated and are subject only to general legal requirements. This means that social media networks, video sharing platforms, search engines, online messaging services and nearly all other online services do not have to adhere to any standards designed to protect users. In contrast, Ofcom regulates television programmes and online video-on-demand services with a standards-based approach, as set out in the Ofcom Broadcasting Code. These standards include the protection of people under 18, protection from harmful or offensive material and material likely to incite crime and disorder, as well as privacy rights.
NetRights welcomes the Government’s proposals to improve online safety.
Concerns have been expressed by some critics that the breadth of these new proposals threatens freedom of speech, limiting the online content that can be accessed by the public. However, we consider that the codes of practice can be carefully drafted to balance the right to freedom of expression with the rights of individuals. As with the Ofcom Broadcasting Code, standards on harm and offence can be required to be set in a way which best guarantees freedom of expression.
Commenting on the White Paper, Head of NetRights, Laura Baglow said: “The current legal framework is fragmented and does not sufficiently address harmful online content, and in particular its ill effects on children and vulnerable groups. There is a gap to be filled by standards-based independent regulation of social media sites. Independent regulation, with carefully drafted codes of practice, is needed to address the many dangers of the internet and social media. If the many benefits of social media and the internet are to be fully maximised in our society then it is essential to ensure that there is no space online for harm and abuse.”
The White Paper on Online Harms can be read in full here.
Laura Baglow is head of NetRights, the Social Media, Internet and Media law department of Parnalls Solicitors. For legal advice and assistance with social media or internet postings please contact enquiries@netrights.co.uk or telephone 01566 772375. Find out more about NetRights here.