https://www.theguardian.com/technology/2020/oct/30/facebook-leak-reveals-policies-restricting-new-york-post-biden-story

Facebook moderators had to manually intervene to suppress a controversial New York Post story about Hunter Biden, according to leaked moderation guidelines seen by the Guardian.

The document, which lays out in detail Facebook’s policies for dealing with misinformation on Facebook and Instagram, sheds new light on the process that led to the company’s decision to reduce the distribution of the story.

“This story is eligible to be factchecked by Facebook’s third-party factchecking partners,” Facebook’s policy communications director, Andy Stone, said at the time. “In the meantime, we are reducing its distribution on our platform. This is part of our standard process to reduce the spread of misinformation. We temporarily reduce distribution pending factchecker review.”

In fact, the documents show, the New York Post – like most major websites – was given special treatment as part of Facebook’s standard process. Stories can be “enqueued” for Facebook’s third-party factcheckers in one of two ways: either by being flagged by an AI, or by being manually added by one of the factcheckers themselves.

Facebook’s AI looks for signals “including feedback from the community and disbelief comments” to automatically predict which posts might contain misinformation. “Predicted content is temporarily (for seven days) soft demoted in feed (at 50% strength) and enqueued to fact check product for review by [third-party factcheckers],” the document says.
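
The flow the document describes maps onto a fairly simple ranking mechanic. Below is a minimal Python sketch of that flow; the class names, the queue structure and the flagging threshold are editorial assumptions, since only the 50% demotion strength, the seven-day window and the enqueueing step come from the leaked wording:

```python
# Illustrative sketch only: Facebook's real pipeline is proprietary. Only the
# 50% strength, the seven-day window and the enqueue step come from the
# document; every name and threshold here is an assumption.
from dataclasses import dataclass
from datetime import datetime, timedelta

SOFT_DEMOTION_STRENGTH = 0.5              # "at 50% strength"
SOFT_DEMOTION_WINDOW = timedelta(days=7)  # "temporarily (for seven days)"

@dataclass
class Post:
    post_id: str
    feed_score: float                      # baseline ranking score (assumed)
    demotion_expires: datetime | None = None

factcheck_queue: list[str] = []            # stand-in for the "fact check product"

def predicted_as_misinfo(community_flags: int, disbelief_comments: int) -> bool:
    """Stand-in for the AI classifier; the real signals and weights are not public."""
    return community_flags + disbelief_comments > 10   # assumed threshold

def soft_demote_and_enqueue(post: Post, now: datetime) -> None:
    """Apply the temporary demotion and queue the post for third-party factcheckers."""
    post.feed_score *= SOFT_DEMOTION_STRENGTH
    post.demotion_expires = now + SOFT_DEMOTION_WINDOW
    factcheck_queue.append(post.post_id)
```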

But some posts are not automatically demoted. Sites in the “Alexa 5K” list, “which includes content in the top 5,000 most popular internet sites”, are supposed to keep their distribution high, “under the assumption these are unlikely to be spreading misinformation”.

Those guidelines can be manually overridden, however. “In some cases, we manually enqueue content … either with or without temporary demotion. We can do this on escalation and based on whether the content is eligible for fact-checking, related to an issue of importance, and has an external signal of falsity.” The US election is such an “issue of importance”.
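
Taken together, the two preceding paragraphs amount to an allowlist check followed by an escalation override. A hedged sketch of that decision logic, in which the domain list, parameter names and the choice not to demote on manual enqueue are all assumptions:

```python
# Illustrative reconstruction of the enqueue decision; the domain entries,
# parameter names and return convention are assumptions, not leaked detail.
ALEXA_TOP_5K = {"nypost.com", "example.com"}   # stand-in entries only

def enqueue_decision(domain: str, ai_flagged: bool, escalated: bool,
                     eligible_for_factcheck: bool,
                     issue_of_importance: bool,
                     external_falsity_signal: bool) -> tuple[bool, bool]:
    """Return (enqueue_for_factcheck, apply_temporary_demotion)."""
    if ai_flagged and domain not in ALEXA_TOP_5K:
        # Default path: soft demotion plus review by third-party factcheckers.
        return True, True
    if (escalated and eligible_for_factcheck
            and issue_of_importance and external_falsity_signal):
        # Manual enqueue "either with or without temporary demotion";
        # demoting is a case-by-case call, modelled here as no demotion.
        return True, False
    # Allowlisted sites otherwise keep full distribution.
    return False, False
```

On this reading, a top-5,000 site such as the New York Post would only enter the factcheck queue through the escalation path, which is consistent with what the leaked document suggests happened.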

In a statement, a Facebook spokesperson said: “As our CEO Mark Zuckerberg testified to Congress earlier this week, we have been on heightened alert because of FBI intelligence about the potential for hack and leak operations meant to spread misinformation. Based on that risk, and in line with our existing policies and procedures, we made the decision to temporarily limit the content’s distribution while our factcheckers had a chance to review it. When that didn’t happen, we lifted the demotion.”

The guidelines also reveal Facebook had prepared a “break-glass measure” for the US election, allowing its moderators to apply a set of policies for “repeatedly factchecked hoaxes” (RFH) to political content. “For a claim to be included as RFH, it must meet eligibility criteria (including falsity, virality and severity) and have content policy leadership approval.”

The policy, which to the Guardian’s knowledge has not yet been applied, would lead to Facebook blocking viral falsehoods about the election without waiting for them to be debunked each time a new version appeared. A similar policy about Covid-19 hoaxes is enforced by “hard demoting the content, applying a custom inform treatment, and rejecting ads”.
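
Read together, the two RFH paragraphs describe an eligibility gate followed by a fixed enforcement bundle. A sketch under those assumptions, with the virality cutoff, field names and demotion strength invented for illustration:

```python
# Hedged sketch of "repeatedly factchecked hoaxes" (RFH) handling. The
# criteria names come from the document; everything else is assumed.
from dataclasses import dataclass

@dataclass
class Claim:
    text: str
    rated_false: bool          # "falsity"
    share_count: int           # "virality"
    severe_harm: bool          # "severity"
    leadership_approved: bool  # "content policy leadership approval"

def is_rfh(claim: Claim, virality_threshold: int = 10_000) -> bool:
    """Eligibility gate reconstructed from the quoted criteria."""
    return (claim.rated_false
            and claim.share_count >= virality_threshold   # cutoff assumed
            and claim.severe_harm
            and claim.leadership_approved)

def enforce_rfh(post: dict) -> None:
    """Mirrors the quoted Covid-19 enforcement; field names are assumed."""
    post["rank_multiplier"] = 0.1      # "hard demoting" (real strength unknown)
    post["inform_label"] = "Independent factcheckers have debunked this claim"
    post["ads_allowed"] = False        # "rejecting ads"
```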

Facebook acts only on a few types of misinformation without involving third-party factcheckers, the documents reveal. Misinformation aimed at voter or census interference is removed outright “because of the severity of the harm to democratic systems”. Manipulated media, or “deepfakes”, are removed “because of the difficulty of ‘unseeing’ content so sophisticatedly edited”. And misinformation that “contributes to imminent violence or physical harm” is removed because of the severity of that imminent physical harm.

The latter policy is not normally applied by ground-level moderation staff, but a special exception has been made for misinformation about Covid-19, the document says. Similar exceptions have been made for misinformation about polio in Pakistan and Afghanistan, and for misinformation about Ebola in the Democratic Republic of the Congo.
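
For reference, those removal categories and their stated rationales can be summarised as a lookup table; the keys and the exception list below are editorial shorthand, not Facebook's internal names:

```python
# Summary table of the removal categories described above. Keys and the
# exception list are editorial shorthand, not Facebook's internal names.
REMOVED_WITHOUT_FACTCHECKERS = {
    "voter_or_census_interference": "severity of the harm to democratic systems",
    "manipulated_media_deepfakes": "difficulty of 'unseeing' sophisticated edits",
    "imminent_violence_or_physical_harm": "severity of the imminent physical harm",
}

# The imminent-harm category is normally escalated above ground-level
# moderators; these health crises are the exceptions the document lists.
GROUND_LEVEL_EXCEPTIONS = {
    "covid-19": "global",
    "polio": "Pakistan and Afghanistan",
    "ebola": "Democratic Republic of the Congo",
}

def removal_rationale(category: str) -> str | None:
    """Stated reason a category is removed without factchecker involvement."""
    return REMOVED_WITHOUT_FACTCHECKERS.get(category)
```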

Facebook also has a unique policy around vaccine hoaxes. “Where groups and pages spread these widely debunked hoaxes about vaccinations two or more times within 90 days, those groups and pages will be demoted in search results, all of their content will be demoted in news feed, they will be pulled from recommendation systems and type-ahead in search, and pages may have their access to fundraising tools revoked,” the document reads.

“This policy is enforced by Facebook and not third-party factcheckers. Thus, our policy of not subjecting politician speech to factchecking does NOT apply here. If a politician shares hoaxes about vaccines we will enforce on that content.”
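
The strike rule quoted above is concrete enough to express directly: two or more widely debunked hoaxes inside a rolling 90-day window trigger the penalty bundle. Only the threshold, the window and the penalty list come from the document; everything else in this sketch is assumed:

```python
# Strike-rule sketch: only the "two or more ... within 90 days" threshold
# and the penalty list come from the document; data shapes are assumed.
from datetime import datetime, timedelta

STRIKE_WINDOW = timedelta(days=90)
STRIKE_LIMIT = 2

def vaccine_hoax_penalties(hoax_timestamps: list[datetime],
                           now: datetime) -> list[str]:
    """Penalties applied to a page or group once the strike rule trips."""
    recent = [t for t in hoax_timestamps if now - t <= STRIKE_WINDOW]
    if len(recent) < STRIKE_LIMIT:
        return []
    return [
        "demote in search results",
        "demote all content in news feed",
        "remove from recommendations and search type-ahead",
        "fundraising tools access may be revoked",   # "may", per the document
    ]

# Example: two hoax shares 30 days apart trip the rule.
now = datetime(2020, 10, 30)
print(vaccine_hoax_penalties([now - timedelta(days=30), now], now))
```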
