Forget GIFs and avatars, Facebook needs to work on its fake news problem
You’ve probably already seen the Facebook avatars surfacing on your news feed – they look like blemish-free, plasticised versions of your mates with rigor mortis smiles. But amongst those grinning faces, you can often spot far more ominous posts about the “hidden dangers” of vaccines and 5G.
The avatars aren’t the only new trick we’re likely to see from Facebook in the coming weeks – the company has just acquired Giphy and also launched a new “Facebook Shops” feature, so your feed will soon be filled with cutesy moving images and commercial tat. But these cheery posts can’t hide the fact that misinformation is still spreading across the platform.
Earlier this month, a trailer for an upcoming “documentary” called Plandemic, in which a discredited virologist speculates that billionaires created Covid-19 to boost vaccine sales, was widely circulated across Facebook. Facebook removed the video, but not before it had been seen by nearly 2 million people, according to Digital Trends.
In an effort to combat this sort of activity, Facebook has rolled out warning labels on videos that potentially contain misinformation, applying around 50 million of them so far. Zuckerberg confirmed in a CBS interview that these labels discourage about 95% of people from opening a flagged link – but that still leaves room for a 5% click-through rate, and there are presumably fake posts out there that haven’t been labelled at all.
The problem is that Facebook was created so people could share silly holiday snaps and insights into their own lives – it wasn’t designed to rigorously test the reliability of a source. Because of this, fake news can go viral on Facebook in the same way that a prank video or a cat GIF can achieve internet notoriety.
And while news publishers are subject to laws surrounding truthful reporting, Facebook isn’t, because there’s a blurred legal line between the responsibility of the company and that of the individual posting the misinformation.
At the beginning of this week, Mark Zuckerberg debated EU industry chief Thierry Breton, who argued that responsibility for stopping the spread of fake news rests with Zuckerberg himself. And unless the CEO starts trying harder to combat deepfakes and misinformation, he could face some serious new legislation.
Facebook reported earnings of $4.9bn in the first quarter of this year, so it should have the resources to tackle this issue – and if it doesn’t, perhaps it can redirect some of the effort it’s pouring into commercial features and cheesy avatars towards combating fake news instead.