Fighting Misinformation at Facebook: Press Fact Sheet

We share society’s concern about misinformation, which is why we have taken aggressive steps to combat it – from building an unparalleled global network of fact-checking partners and promoting accurate information to removing content that breaks our rules. Misinformation is complex and constantly evolving, which is why we continue to consult with outside experts, grow our fact-checking program, and improve our internal technical capabilities.

 

Defining Misinformation

Misinformation is false information that is often shared unintentionally. The content is shared on an individual basis and is not part of any coordinated attempt to mislead or deceive people.

Disinformation refers to sharing content with the deliberate intent to mislead as part of a manipulation campaign or information operation. This activity is coordinated and can involve the use of fake accounts. We don’t tolerate this activity and take these actors and their content down as soon as we become aware of them. We also disclose our work in this space via a monthly report in our Newsroom.

Our Approach to Misinformation

We have a three-part strategy for addressing misinformation on Facebook – Remove, Reduce and Inform. Our third-party fact-checking program is a key part of this strategy.

 

Third-Party Fact-Checking

We do not believe any single entity – either a private company or government – should have the power to decide what is true and what is false. When one single actor is the arbiter of truth, there is a power imbalance and potential for overreach. With this in mind, we rely on independent fact-checkers to identify and review potential misinformation, which enables us to take action.

We partner with over 80 independent third-party fact-checkers globally, working in more than 60 languages. Our partners are certified through the independent, non-partisan International Fact-Checking Network. Our third-party fact-checkers in Sri Lanka are AFP and Fact Crescendo.

Remove

We have policies in place to address some of the most harmful types of false information and we REMOVE this content as soon as we become aware of it.

We remove COVID-19 misinformation that could contribute to imminent physical harm, including false claims about cures, treatments, the availability of essential services, or the location and severity of the outbreak. We also remove false claims about COVID-19 vaccines that have been debunked or are unsupported by evidence, such as false claims about the safety, efficacy, ingredients or side effects of the vaccines. Since March last year, we have removed 16 million pieces of COVID-19 misinformation.

Our advertising policies have long prohibited misleading claims, and we have implemented additional advertising policies specific to COVID-19.

Misinformation that doesn’t fall under these particular policies may still be shared in a way that violates our other Community Standards – for example, hate speech, bullying and harassment, or spam – and we remove anything we identify as violating those policies.

 

Reduce

When fact-checkers rate a story as false, altered or partly false, we significantly REDUCE its distribution in the Facebook News Feed and Instagram Feed so that fewer people see it. On Instagram, we also make it harder to find by filtering it from Explore and hashtag pages.

Pages and domains that repeatedly share false news will also see their distribution reduced and lose their ability to monetize and advertise. We also reduce the distribution of other spammy, sensational content, such as clickbait and engagement bait, which can also coincide with misinformation.

Inform

We INFORM people by giving them more context so they can decide for themselves what to read, trust and share.

 

On Facebook and Instagram

There are a number of different labels our third-party fact-checkers can choose from when rating content, including False, Altered, Partly False, Missing Context and Satire. Content across Facebook and Instagram that has been rated false or altered is prominently labeled so people can better decide for themselves what to read, trust and share. These labels are shown on top of false and altered photos and videos, including Stories content on Instagram, and link out to the fact-checker’s assessment. For content rated partly false or missing context, we apply a lighter-weight warning label. To help scale the work of our third-party fact-checkers, we use artificial intelligence to identify content that is identical or similar to content they have rated, and we automatically apply labels to it or reduce its distribution.
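To make the idea of AI-assisted label scaling more concrete, here is a minimal, purely illustrative sketch of similarity-based label propagation: a new piece of content is compared against items already rated by fact-checkers, and a sufficiently close match inherits the same rating. The names (RatedItem, propagate_label), the toy embeddings and the 0.9 threshold are assumptions for illustration only and do not describe Facebook’s actual systems.

```python
# Hypothetical sketch: propagate a fact-checker's rating to near-duplicate content.
# Illustrative only; not a description of Facebook's production systems.
from dataclasses import dataclass

import numpy as np


@dataclass
class RatedItem:
    content_id: str
    rating: str            # e.g. "false", "altered", "partly_false"
    embedding: np.ndarray  # vector produced by some content-understanding model


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Standard cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def propagate_label(new_embedding: np.ndarray,
                    rated_items: list[RatedItem],
                    threshold: float = 0.9) -> str | None:
    """Return the rating of the closest fact-checked match above `threshold`,
    or None if nothing rated is similar enough (threshold is hypothetical)."""
    best_rating, best_score = None, threshold
    for item in rated_items:
        score = cosine_similarity(new_embedding, item.embedding)
        if score >= best_score:
            best_rating, best_score = item.rating, score
    return best_rating


# Example usage with toy vectors standing in for real content embeddings.
rated = [RatedItem("post-1", "false", np.array([0.9, 0.1, 0.0]))]
incoming = np.array([0.88, 0.12, 0.01])
label = propagate_label(incoming, rated)
if label is not None:
    print(f"Apply '{label}' warning label and reduce distribution")
```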

In 2018, we launched a context button which provides information about the sources of articles people see in News Feed. Last year, we launched a new notification to let people know when a news article they’re about to share is more than 90 days old.

Access to reliable information

We have a number of additional measures in place to keep people informed about COVID-19. In March 2020, we launched the COVID-19 Information Center at the top of News Feed, which includes real-time updates from health authorities as well as helpful articles, videos and posts about social distancing and preventing the spread of COVID-19. People can also follow the COVID-19 Information Center to receive updates from health authorities directly in their News Feed. Through the COVID-19 Information Center we have connected over 2 billion people to resources from health authorities, and we have displayed warning labels on over 167 million posts rated false by third-party fact-checkers.

We have built a strong partnership with the Health Promotion Bureau (HPB) of Sri Lanka’s Ministry of Health to help build capacity, raise awareness about COVID-19-related health measures and support the vaccine rollout. In April 2021, we launched a public education campaign to tackle misinformation, and we continue to direct people to the HPB’s website for the latest directives.

Digital literacy programs

We invest in a range of programs and partner with civil society organizations as well as industry partners on initiatives that address the underlying issue of digital and news literacy. Last year, Facebook’s flagship digital literacy program, We Think Digital, was launched in Sri Lanka in partnership with Sarvodaya-Fusion and with the support of the Ministry of Education and the Information and Communication Technology Agency of Sri Lanka (ICTA), to develop skills that enable people to create a positive and safe culture online. This year, we partnered with Sri Lanka’s Vocational Training Authority (VTA) to launch a series of training sessions that help people upgrade their digital skills.
