Facebook wants a “one-size-fits-all” regulatory system that can’t exist

Facebook has released a whitepaper that examines the changing online landscape and discusses the future of content regulation. While the paper, titled ‘Charting a Way Forward’, appears reasonably well-intentioned and contains some genuine discussion of internet regulation, Facebook seems to want an easy “one-size-fits-all” solution that can never exist.

One excerpt reads:

“In the United States… the First Amendment protects a citizen’s ability to engage in dialogue online without government interference except in the narrowest of circumstances.

“Citizens in other countries often have different expectations about freedom of expression, and governments have different expectations about platform accountability. Unfortunately, some of the laws passed so far do not always strike the appropriate balance between speech and harm, unintentionally pushing platforms to err too much on the side of removing content.

“If governments do not agree on how to balance these interests when writing laws, it is likely they will have very different ideas of how to define online content standards, which require even more specificity than laws in order to ensure consistent application.”

It’s true, of course, that laws and cultural expectations differ from country to country, and that does make regulation more difficult – whether it comes from Facebook’s own moderation efforts or from governments above. However, those conditions seem unlikely to change any time soon.

To reinforce that point, earlier this week Thierry Breton, the EU’s industry commissioner, said: “It’s not for us to adapt to this company, it’s for this company to adapt to us,” after holding a meeting with Mark Zuckerberg. He clearly wasn’t too impressed with the company’s ideas on regulation.

However, it certainly seems true that internet regulation could benefit from some new thinking. The proliferation of hate speech, terrorist propaganda and other harmful content has become a much-discussed problem on social media and across the wider web.

In the whitepaper, the social media giant has asked for a strange mix of regulation and flexibility.

“Governments that seek to define for internet companies what content they should allow on their platforms should seek to… provide flexibility” the paper says, “so that platforms can adapt policies to emerging language trends and adversarial efforts to avoid enforcement.

“For instance, hateful speech, bullying, and threats of self-harm are often expressed through a lexicon of words that fall in and out of favour or evolve over time. Similarly, proscribed terror groups and hate groups may rename themselves or fracture into different groups”.

While the example makes perfect sense, it’s attached to quite a big ask – essentially, ‘give us the power, in some instances, to decide what needs regulating’. That’s not to say such a solution couldn’t work, but it would likely spark contentious debate.

Facebook also says that “regulators must be cautious” in applying any new rules, in case new additions make things worse rather than better.

While that, and one or two other throwaway lines, made Facebook sound like its usual sinister self, there was plenty of good, well-intentioned discussion in the paper.

One excerpt sees Facebook offer a useful example of the tension between regulator and regulated:

“For example, a requirement that companies ‘remove all hate speech within 24 hours of receiving a report from a user or government’ may incentivize platforms to cease any proactive searches for such content, and to instead use those resources to more quickly review user and government reports on a first-in-first-out basis.

“In terms of preventing harm, this shift would have serious costs. The biggest internet companies have developed technology that allows them to detect certain types of content violations with much greater speed and accuracy than human reporting.

“For instance, from July through September 2019, the vast majority of content Facebook removed for violating its hate speech, self-harm, child exploitation, graphic violence, and terrorism policies was detected by the company’s technology before anyone reported it.”

Overall, Facebook’s whitepaper is a strange mix, both tonally and in terms of its subject matter. It will be interesting to see how companies and governments respond – if they respond.
