In its bid to fight the spread of Covid-19-related misinformation on its platform, Facebook will now send notifications directly to users who like, share, or comment on such posts.
According to a report from Fast Company, the social network is changing how it reaches people who have encountered misinformation on its platform.
“The company will now send notifications to anyone who has liked, commented, or shared a piece of Covid-19 misinformation that’s been taken down for violating the platform’s terms of service,” the report said on Tuesday.
The notification will read: “We removed a post you liked that had false, potentially harmful information about Covid-19.”
The company will then connect users with trustworthy sources in an effort to correct the record.
If a user interacts with a fake post that has been removed, Facebook will send a notification to the user telling them that the post was taken down.
“If the user clicks the notification, they’ll be taken to a landing page with a screenshot of the post and a short explanation for why it was removed.”
It will also offer follow-up actions, like the option to unsubscribe from the group that originally posted the false information or to “see facts” about Covid-19.
Earlier this month, Facebook announced it would step up its fight against misinformation about Covid-19 vaccines, saying it will remove from its platform and Instagram false claims about the vaccines that have been debunked by public health experts.
Facebook said it will remove false claims that Covid-19 vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list.
As part of the effort to reduce the spread of “vaccine hoaxes”, Facebook and its photo-sharing app Instagram said last month they will no longer allow advertisements that include misinformation about vaccines. (IANS)