- Our comprehensive strategy to protect the US 2020 elections started years before the election cycle began and was designed to last through the inauguration. It included some temporary measures, but those were only a small part of our larger strategy.
- We expanded our policies in 2020 to remove militia organizations from our services and have since banned more than 890 militarized social movements. We also put rules in place prohibiting QAnon and militia groups from organizing on our platform.
- As with any significant public event, debate about the election results was bound to show up on Facebook. But responsibility for the insurrection lies with those who broke the law during the attack and those who incited them, not with how we implemented just one of the many steps we took to protect the US election.
Long before the US election period began last year, we expected that the 2020 election would be one of the most contentious in history, and that was before we even knew it would be conducted in the midst of a pandemic. Since 2016, we have invested in people, technology, policies and processes to ensure that we were ready, and we began planning for the 2020 election itself two years in advance. We built our strategy to run all the way through Inauguration Day in 2021, knowing that there was a high likelihood the election results would be contested, so we planned specifically for that scenario. This election planning built on all of the other integrity work and investments we have made since 2016.
These included:
- Building a global team of 35,000 people to work on safety and security, a team that has since grown to more than 40,000.
- Empowering more people to participate, including by helping to register more than 4.5 million voters in the US last year
- Activating our Elections Operations Center, staffed by subject matter experts from across Facebook, to identify and respond to threats across the platform, and keeping it in place throughout the post-election period and past the inauguration.
- Going after covert influence operations that sought to interfere on our platform, similar to what happened in 2016. As a result we took down networks targeting the US, including five networks engaged in coordinated inauthentic behavior from Russia, five from Iran, one from China, and five domestic US-origin networks.
- Expanding our policies and investments to remove militia groups and prevent QAnon from organizing on our platform
- Taking additional steps to control the virality of potentially harmful content. These included demoting content that our systems predicted was likely to be violence and incitement, and temporarily reducing the distribution of content our systems predicted might be false, pending review by a third-party fact-checker.
- Starting in late 2019, phasing out News Feed ranking models that up-ranked content our systems predicted the viewer would re-share and other people would engage with. These changes targeted content predicted to be about politics or social issues, and they remain in place today (a simplified sketch of this kind of demotion follows this list).
- Prohibiting voter suppression content — from March 2020 to Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the US for violating our voter interference policies.
- Partnering with third-party fact-checkers to help us label election-related misinformation and remove content that breaks our rules
- Providing more transparency about political advertising on the platform through our Ads Library so people can see who is behind these ads
- Requiring anyone who runs political and social issue ads on Facebook to prove their identity and be authorized to run them
- Blocking any new political and social issue ads from starting during the seven days before the election, and then pausing all political ads entirely between Election Day and the inauguration.
- Creating a first-of-its-kind Voting Information Center to make sure people had reliable information about the election and how to vote. Once results were in, the Voting Information Center promoted the accurate election results, and we kept it in place long after Election Day.
- Placing a notification at the top of Facebook and Instagram making it clear that Joe Biden was the projected winner once a majority of independent decision desks at major media outlets, including ABC, CBS, Fox, NBC, CNN and the AP, called it.
- Adding labels on posts about voting and the election, including from politicians, with a link to the Voting Information Center so people could get the latest updates on vote counting and the results. Following the election, we also applied labels naming the projected winner to all presidential candidates' posts, with a link to the Voting Information Center for more about the results. And when we became aware of content that misrepresented the election or vote-counting process, we added labels that included information from the Bipartisan Policy Center.
- Putting a series of temporary product measures in place where there were specific risks that spikes in activity could outpace the many systems we use to enforce our policies. Examples include limiting the distribution of live videos that our systems predicted might relate to the election, and automatically removing potentially violating content at lower confidence thresholds than we normally would, before review by our team. We took these steps in response to specific signals we were seeing on the platform, such as spikes in reported content, and turned some of them off responsibly and gradually as those signals returned to their previous levels. We also left many of them in place through Inauguration Day.
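Facebook has not published how the ranking demotions mentioned above work internally. Purely as illustration, here is a minimal sketch of the general idea in Python, where every classifier name, score, threshold, and demotion multiplier is a hypothetical assumption rather than anything Facebook has confirmed:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    base_score: float       # hypothetical upstream relevance score
    p_violating: float      # hypothetical classifier: probability of violence/incitement
    p_misinfo: float        # hypothetical classifier: probability the post is false
    p_civic_reshare: float  # hypothetical predicted re-share likelihood for civic content

# Hypothetical demotion multipliers and thresholds; real values are not public.
VIOLATING_THRESHOLD, VIOLATING_DEMOTION = 0.8, 0.5
MISINFO_THRESHOLD, MISINFO_DEMOTION = 0.7, 0.3
RESHARE_THRESHOLD, RESHARE_DEMOTION = 0.9, 0.7

def ranked_score(post: Post) -> float:
    """Demote the base ranking score when classifiers flag likely harm."""
    score = post.base_score
    if post.p_violating > VIOLATING_THRESHOLD:
        score *= VIOLATING_DEMOTION   # predicted violence and incitement
    if post.p_misinfo > MISINFO_THRESHOLD:
        score *= MISINFO_DEMOTION     # reduced reach pending fact-check review
    if post.p_civic_reshare > RESHARE_THRESHOLD:
        score *= RESHARE_DEMOTION     # damp viral re-sharing of civic content
    return score

posts = [
    Post("a", base_score=1.0, p_violating=0.1, p_misinfo=0.1, p_civic_reshare=0.2),
    Post("b", base_score=1.0, p_violating=0.9, p_misinfo=0.1, p_civic_reshare=0.2),
]
feed = sorted(posts, key=ranked_score, reverse=True)  # post "b" now ranks below post "a"
```

The key property of this kind of design is that nothing is removed at this stage: content predicted to be harmful simply ranks lower, so fewer people see it while review is pending.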
While those temporary measures were important, as the list above shows, they were only one part of a much longer series of steps that we took well before, during and after Election Day. When preparing for the election, we planned for multiple potential outcomes and weighed many societal factors in order to anticipate and respond to potential violence.
That’s a big part of the reason why we developed these additional product levers for extraordinary circumstances, which we internally called “break the glass” measures. It’s also why we kept our full suite of systems, including many of the “break the glass” measures, in place well after Election Day, even after specific signals about potential threats leveled off and more than a month had passed since major news outlets called the election for now-President Joe Biden.
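The post does not describe the mechanics of these measures. As a hedged sketch only, lowering automated-removal confidence thresholds in response to a signal spike might look something like the following, with every threshold and trigger value assumed for illustration:

```python
# Hypothetical "break the glass" configuration: automated removal normally
# requires very high classifier confidence, but the bar is lowered during
# a declared high-risk period and restored once signals return to baseline.
NORMAL_REMOVAL_THRESHOLD = 0.95       # assumed value
BREAK_GLASS_REMOVAL_THRESHOLD = 0.80  # assumed lowered value

def should_auto_remove(violation_confidence: float, break_glass_active: bool) -> bool:
    """Remove content automatically, ahead of human review, above the active threshold."""
    threshold = (BREAK_GLASS_REMOVAL_THRESHOLD if break_glass_active
                 else NORMAL_REMOVAL_THRESHOLD)
    return violation_confidence >= threshold

def spike_detected(reports_per_hour: float, baseline_per_hour: float) -> bool:
    """Assumed trigger: reported-content volume spikes well above its baseline."""
    return reports_per_hour > 3 * baseline_per_hour  # 3x multiplier is an assumption

# Example: a 0.85-confidence prediction is removed only while the measure is active.
assert should_auto_remove(0.85, break_glass_active=True)
assert not should_auto_remove(0.85, break_glass_active=False)
```

Turning such measures off "responsibly and gradually," as the post describes, would amount to ramping the active threshold back toward its usual value rather than flipping it in one step.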
To blame what happened on January 6 on how we implemented just one item on the list above is absurd. We are a significant social media platform, so it's only natural for content about major events to show up on Facebook. But responsibility for the insurrection itself falls squarely on the insurrectionists who broke the law and on those who incited them. We worked with law enforcement in the days and weeks after January 6 with the goal of ensuring that information linking the people responsible to their crimes was available. Of course, there are always lessons to be learned from the work we do to protect elections and respond to both immediate threats and longer-term challenges, and we will apply them as we continue this work.