Ofcom will soon regulate Facebook, Twitter and other social media platforms - here's what it means

Communications watchdog Ofcom will be given new powers to hold social media companies accountable for hosting harmful content on their platforms, it has been announced.

The legislation, which is reportedly still being drafted, will place a legal “duty of care” on social media companies such as Facebook, Twitter, YouTube, Snapchat, Instagram, and TikTok to ensure users are protected from illegal or harmful content.

Until now, these companies have been self-regulating and have defended their own efforts to take down unacceptable content, but critics say independent rules are needed to keep people safe.

The harmful impact of social media

Concerns about the impact of social media on vulnerable people have been heightened by suicides such as that of schoolgirl Molly Russell in 2017.

The 14-year-old was found to have viewed harmful content on Instagram, prompting the platform to ban all images of self-harm or suicide.

The government is due to officially announce the new powers for Ofcom (which currently only regulates the media, not internet safety) on Wednesday (12 Feb), as part of its plans for a new legal duty of care.

Aim to minimise risks

Ofcom will have the power to require platforms to remove harmful content, such as material promoting violence, terrorism, cyber-bullying and child abuse, although it is unclear what penalties the regulator will be able to enforce.

Platforms will also be expected to "minimise the risks" of such content appearing at all.

Jonathan Oxley, interim Chief Executive of Ofcom, said: “We share the Government’s ambition to keep people safe online and welcome that it is minded to appoint Ofcom as the online harms regulator.

“We will work with the Government to help ensure that regulation provides effective protection for people online and, if appointed, will consider what voluntary steps can be taken in advance of legislation.”