Brit regulator Ofcom put at helm as hosting platforms threatened with hefty fines for violent videos

Interim measure until Brexit, or never... whichever happens first

The UK government has threatened hosting platforms with big fines for providing access to unpleasant videos, and will task UK comms regulator Ofcom with overseeing how that happens.

The rules come into force on 19 September to comply with European Union regulations and will remain in force until 31 October. If the UK leaves the union without a deal on that date, they may or may not continue to be enforced. In the longer term, the idea is to set up a new regulator to oversee such content.

The change will extend the Audiovisual Media Services Directive to cover video-sharing and streaming services, not just broadcast and video-on-demand services. It aims to protect children from violent, pornographic or "extremist" content.

Member states can decide what punishments to impose, but the UK government seems inclined to extend to Video Sharing Platforms (VSPs) Ofcom's existing power to fine broadcasters and on-demand services up to £250,000 or 5 per cent of revenue.

Ofcom sent us the following statement: "These new rules are an important first step in regulating video-sharing online, and we'll work closely with the Government to implement them. We also support plans to go further and legislate for a wider set of protections, including a duty of care for online companies towards their users."

The Department for Digital, Culture, Media and Sport is still consulting on how the rules will work in practice, and there is plenty of room for confusion. Services will be required either to apply to Ofcom for a licence, to notify Ofcom, or simply to agree to follow Ofcom's rules.

The consultation document notes possible confusion over exactly which sites and services will be defined as "Video Sharing Platforms". Ofcom called on the government to provide more clarity while also admitting that the changing nature of the market might make this difficult.

But the directive will exclude "video clips embedded in the editorial content of electronic versions of newspapers and magazines and animated images such as GIFs".

The government aims to make the industry pay for any further regulatory activity.

Alastair Graham, chair of the Age Verification Providers Association and CEO of AgeChecked, said of the move: "Young people have more access to online content than ever before; in fact, our recent research found that 59 per cent of children have already started using platforms such as Facebook and Twitter before the age of 10 – despite the minimum age requirement being 13.

"Whilst the ever-growing market of technologies can be of great benefit to children, they also pose unprecedented risks. It's therefore encouraging to see appropriate measures – such as robust, integrated age verification systems – being enforced to ensure young people are better protected from potentially harmful material."

There's more on the consultation here (PDF). ®