UK plans to regulate the internet won't make much difference at all

The UK government has announced plans to regulate online platforms like Facebook and YouTube, aiming to ensure they are monitoring and removing harmful and illegal content, such as images of child abuse or material that promotes terrorism. It intends to appoint Ofcom – which currently oversees complaints about broadcast television – as the regulator.


The government is keen to appear tough with its new “online harms” regime. Speaking today, UK culture secretary Nicky Morgan said the new rules would be “an incredible opportunity to lead the world in building a thriving digital economy, driven by ground-breaking technology, that is trusted by and protects everyone in the UK.”


But look below the surface and the announcement is something of a damp squib, providing little more than an official stamp of approval for actions those social networks already take.

Under pressure 


These online platforms have been under pressure to clean up their act for some years now. Both the threat of regulation and media pressure, including reporting by New Scientist on the radicalizing effects of YouTube’s algorithm, have compelled the sites to take action.


And they have done so, on a scale that Ofcom couldn’t possibly match. The UK media regulator employs just under 1000 people, while Google, the parent company of YouTube, uses 10,000 contracted content moderators to weigh up whether material breaches laws or site guidelines. Facebook says it has 35,000 people doing the same.


Between July and September 2019, YouTube removed more than 3 million channels, more than 55 million videos, and nearly 520 million comments from its platform for not meeting its terms and conditions. In the same three months, Facebook took action on 92 million pieces of content containing adult material or nudity, much of which was found before users reported it.


Unknown workings 

Despite this, the systems put in place by the platforms aren’t perfect, often overlooking some content or incorrectly flagging other material, with some decisions later reversed.


It is unclear what Ofcom will add to this. The organization itself is unsure how the moderation regime will work and is keen to point out that any plans are at an early stage.


Jonathan Oxley, the regulator’s interim chief executive, says Ofcom will work with the government “to help ensure that regulation provides effective protection for people online and, if appointed, will consider what voluntary steps can be taken in advance of legislation.”


More details are due in the coming months, including potential announcements about how Ofcom will staff up to meet the demands. The organization admits the volume of content on the internet is on a different scale to broadcast television.


Because of that, both Ofcom and the government have said that the onus to police content would remain with the platforms. Under the mooted plans, people couldn’t directly complain about an individual piece of content to Ofcom. In that sense, nothing has changed. And for that reason, it is unsurprising that YouTube and Facebook tell New Scientist they welcome the chance to co-operate with the government.


The regulator would be able to haul up the platforms over any issues, with the government hinting at the potential for fines and liability for directors – but YouTube announced last week that it earned $15 billion from adverts alone last year, so any penalty would be a drop in the ocean. In short, the “new powers” proposed for the regulator aren’t new and aren’t real powers. Don’t expect much to change.
