New regulations
Ofcom, the UK's communications regulator, has laid out a number of measures that video-sharing platforms (VSPs) are expected to take to protect their users, or else face hefty fines.
The VSPs, including TikTok, Snapchat, Vimeo, and Twitch, are all ordered to take “appropriate measures” to protect users from content related to child sexual abuse, racism, and terrorism. The move comes after research by Ofcom revealed that a third of users have seen hateful content on these sites.
Strict consequences
VSPs will be required to provide and enforce clear rules for uploading content, make reporting and complaints processes as easy as possible, and restrict access to adult sites with robust age verification. The VSPs will be under the microscope, with the promise of an Ofcom report next year that will assess whether and how these regulations have been implemented.
YouTube won’t escape regulation either. The main video-sharing platform is expected to fall under the Irish regulatory regime, but it also comes under the Online Safety Bill, which, when it becomes law, will offer a much broader remit for tackling harmful content on the bigger technology platforms like Twitter, Facebook, and Google.