Facebook: Meta Asks Oversight Board to Advise on COVID-19 Misinformation Policies
Meta is asking the Oversight Board for advice on whether measures to address dangerous COVID-19 misinformation, introduced in extraordinary circumstances at the onset of the pandemic, should remain in place as many, though not all, countries around the world seek to return to more normal life.

Misinformation related to COVID-19 has presented unique risks to public health and safety for more than two years. To keep our users safe while still allowing them to discuss and express themselves on this important topic, we broadened our harmful misinformation policy in the early days of the outbreak in January 2020. Before this, Meta only removed misinformation when local partners with relevant expertise told us a particular piece of content (like a specific post on Facebook) could contribute to a risk of imminent physical harm. The change meant that, for the first time, the policy would provide for the removal of entire categories of false claims on a worldwide scale.

As a result, Meta has removed COVID-19 misinformation on an unprecedented scale. Globally, more than 25 million pieces of content have been removed since the start of the pandemic. Under this policy, Meta began removing false claims about masking, social distancing and the transmissibility of the virus. In late 2020, when the first vaccine became available, we also began removing further false claims, such as claims that the vaccines are harmful or ineffective. Meta's policy currently provides for the removal of 80 distinct false claims about COVID-19 and vaccines.

Meta remains committed to combating COVID-19 misinformation and providing people with reliable information. As the pandemic has evolved, the time is right for us to seek input from the Oversight Board about our measures to address COVID-19 misinformation, including whether those introduced in the early days of an extraordinary global crisis remain the right approach for the months and years ahead. The world has changed considerably since 2020. We now have Meta's COVID-19 Information Center, and guidance from public health authorities is more readily available. Meta's COVID-19 Information Center has connected over two billion people across 189 countries to helpful, authoritative COVID-19 information.

The pandemic itself has also evolved. In many countries where vaccination rates are relatively high, life is increasingly returning to normal. But this isn't the case everywhere, and the course of the pandemic will continue to vary significantly around the globe — especially in countries with low vaccination rates and less developed healthcare systems. It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in.

Meta is fundamentally committed to free expression, and we believe our apps are an important way for people to make their voices heard. But some misinformation can lead to an imminent risk of physical harm, and we have a responsibility not to let this content proliferate. The policies in our Community Standards seek to protect free expression while preventing this dangerous content. But resolving the inherent tensions between free expression and safety isn't easy, especially when confronted with unprecedented and fast-moving challenges, as we have been throughout the pandemic. That's why we are seeking the advice of the Oversight Board in this case. Its guidance will also help us respond to future public health emergencies.

The Oversight Board was established to exercise independent judgment, operating as an expert-led check and balance for Meta, with the ability to make binding decisions on specific content cases and to offer non-binding advisory opinions on its policies. We are requesting an advisory opinion from the Oversight Board on whether Meta’s current measures to address COVID-19 misinformation under our harmful health misinformation policy continue to be appropriate, or whether we should address this misinformation through other means, like labeling or demoting it either directly or through our third-party fact-checking program.