YouTube’s problem with dodgy videos is about much more than just porn
The staggering extent of dodgy content uploaded to YouTube has been laid bare after the Google-owned video streaming giant released its first-ever Community Guidelines Enforcement Report.
Covering October to December 2017, the report reveals that some 8.3 million videos were removed from the platform during the three-month period for violating its standards.
While the vast majority of these were automatically flagged by Google’s machine army, the sky-high figure includes a worrying 1.6 million user reports of live child abuse videos, and over 490,000 human complaints regarding extremist content.
The largest number of offensive clips spotted by real people? Perhaps unsurprisingly, sexually explicit videos accounted for the biggest share of the roughly nine million user flags submitted in Q4 2017.
With up to 300 hours of content uploaded to YouTube every minute, it’s clear that staying on top of dodgy videos is a Herculean task for the service.
Adding to the challenge is the fact that many incendiary videos don’t appear to breach any obvious community guidelines. For instance, a quick Trusted Reviews search for the controversial far-right youth movement ‘Generation Identity’ brought up no end of content featuring questionable sentiment, most of which cleverly trod a fine line and appeared to be protected as free speech.
YouTube already employs a dedicated team of human reviewers to vet the flags its AI algorithms pick up, as well as to independently investigate offensive content. Even so, the company has quietly admitted that this team is woefully understaffed, revealing plans to expand its video review workforce to more than 10,000 staffers.
Do you think YouTube does a good job policing offensive content? Tweet your thoughts to us @TrustedReviews.