A controversial internal memo to Facebook staff by Andrew Bosworth, VP of AR and VR at Facebook, has leaked to the New York Times.
The memo, entitled “Thoughts for 2020”, accepts some of the media criticism aimed at the company, while disputing the coverage of the Cambridge Analytica scandal (“a total non-event”) and assuring employees that Facebook should not change its political advertising practices even though this “very well may lead” to a Trump re-election in November.
“So was Facebook responsible for Donald Trump getting elected? I think the answer is yes, but not for the reasons anyone thinks,” wrote Bosworth, who was working in the ads department at the time of the election. “He didn’t get elected because of Russia or misinformation or Cambridge Analytica. He got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser. Period.”
This, Bosworth writes, wasn’t down to misinformation, hoaxes or aggressive microtargeting. “They just used the tools we had to show the right creative to each person. The use of custom audiences, video, ecommerce, and fresh creative remains the high water mark of digital ad campaigns in my opinion.”
Trump’s election was, Bosworth says, nothing to do with Cambridge Analytica – a company that tried to share in the success after the event. “In practical terms, Cambridge Analytica is a total non-event,” he wrote. “They were snake oil salespeople. The tools they used didn’t work, and the scale they used them at wasn’t meaningful. Every claim they have made about themselves is garbage.”
Bosworth describes himself as a “committed liberal”, adding that he finds himself “desperately wanting to pull any lever at my disposal to avoid the same result”, but argues that Facebook should not do so, using the work of Tolkien to make his point.
“I find myself thinking of the Lord of the Rings at this moment. Specifically when Frodo offers the ring to Galadrial [sic] and she imagines using the power righteously, at first, but knows it will eventually corrupt her,” he writes. “As tempting as it is to use the tools available to us to change the outcome, I am confident we must never do that or we will become that which we fear.”
Bosworth made it clear that Facebook does believe it has a duty to step in at times. “Things like incitement of violence, voter suppression, and more are things that same moral philosophy would safely allow me to rule out,” he added.
The wide-ranging post also argues that filter bubbles – a bête noire of many Facebook critics – aren’t as big a problem as polarisation. “What happens when you see 26% more content from people you don’t agree with?” he asks. “Does it help you empathise with them as everyone has been suggesting? Nope. It makes you dislike them even more.
“This is also easy to prove with a thought experiment: whatever your political leaning, think of a publication from the other side that you despise. When you read an article from that outlet, perhaps shared by an uncle or nephew, does it make you rethink your values? Or does it make you retreat further into the conviction of your own correctness?”
Perhaps more surprisingly, Bosworth accepts that social media isn’t universally good for people. “While Facebook may not be nicotine I think it is probably like sugar. Sugar is delicious and for most of us there is a special place for it in our lives. But like all things it benefits from moderation.”
After the New York Times published the memo, Bosworth republished the whole post on his public Facebook wall, noting that it wasn’t intended for public consumption, but that it was important to see it in context. “My day to day work doesn’t cover the issues discussed, so for example I’m not responsible for the teams working on misinformation or civic integrity,” he said, while adding that several of the internal comments his post had received had “changed my views”.