Social media firms could face big fines over abusive content under new laws - what you need to know
Social media firms could face big fines over abusive and harmful content online under new internet laws.
The Government claims its Online Safety Bill will help keep children safe online and combat racism and other abuse.
Big tech companies will be required to remove and limit harmful content, with Ofcom given powers to block access to sites and issue large fines to firms that fail to protect users.
The Government has also included a deferred power, which could be introduced at a later date, to make senior managers at firms criminally liable for failing to follow a new duty of care. Provisions to tackle online scams and protect freedom of expression have also been included.
Pressure to more strictly regulate internet companies has grown in recent years amid increasing incidents of online abuse.
A wide range of professional sports bodies, athletes and organisations recently took part in a social media boycott in protest at alleged inaction by tech firms against online abuse.
What powers will Ofcom have?
As the new online regulator, Ofcom will be given the power to fine companies that fail to comply up to £18 million or 10% of their annual global turnover, whichever is higher – a figure that could run into billions of pounds for the largest firms.
Ofcom will also have the power to block access to sites, the Government said.
The new rules, which are expected to be brought before Parliament in the coming months, are set to be the first major set of regulations for the internet anywhere in the world.
What the Government has said
“Today the UK shows global leadership with our ground-breaking laws to usher in a new age of accountability for tech and bring fairness and accountability to the online world,” Digital Secretary Oliver Dowden said.
Writing in the Daily Telegraph, he added: “What does all of that mean in the real world? It means a 13-year-old will no longer be able to access pornographic images on Twitter. YouTube will be banned from recommending videos promoting terrorist ideologies.
“Criminal anti-semitic posts will need to be removed without delay, while platforms will have to stop the intolerable level of abuse that many women face in almost every single online setting.
“And, of course, this legislation will make sure the internet is not a safe space for horrors such as child sexual abuse or terrorism.”
New laws will also target online scams
Under the new duty of care rules, the largest tech companies and platforms will be expected to take action not only against the most dangerous content, but also against content that is lawful but still harmful, such as material linked to suicide and self-harm, and misinformation.
The Government said the deferred power to pursue criminal action against named senior managers would be introduced if tech companies fail to live up to their new responsibilities, with a review of the new rules set to take place two years after the legislation comes into force.
The proposed laws will also target online scams, requiring online firms to take responsibility for fraudulent user-generated content, including financial fraud schemes such as romance scams and fake investment opportunities in which people are tricked into sending money to fake identities or companies.