Google is under fire once again over claims of dodgy content on its YouTube Kids app.
Child and consumer advocacy groups have complained to the US Federal Trade Commission that the app contains age-inappropriate content.
The complaint claims that videos on the app are ‘extremely disturbing and/or potentially harmful for young children to view’.
The welfare consortium has lambasted the Google-owned platform for showing content relating to alcohol, drugs, sex, child abuse, suicide, and Michael Jackson grabbing his crotch.
The groups posted to Vimeo on May 5 a montage of clips lifted from YouTube Kids, highlighting their concerns.
YouTube Kids was launched back in February on iOS and Android, in an effort to provide children with an age-appropriate platform for viewing videos.
Unfortunately, this is not the first time that the service has been at the centre of controversy. Just last month advocacy groups claimed the app was blending videos and advertisements in a ‘deceptive’ way.
“The videos provided to children on YouTube Kids intermix commercial and other content in ways that are deceptive and unfair to children and would not be permitted to be shown on broadcast and cable television,” read a letter by several child and consumer welfare groups.
Google maintains that users can flag videos they deem inappropriate, which it will manually review and, if necessary, remove.
Parents also have the option to disable search within the app, preventing dodgy clips from showing up in search results.