Facebook is the world's largest social media platform, and moderating it is no small task. Though each Facebook group and page has moderators of its own, the spread of fake news that shapes people's opinions still needs to be tackled. With this in mind, the social media platform will hold admins of Facebook Groups more accountable for Community Standards violations.
"When people in a group repeatedly share content that has been rated false by independent fact-checkers, we will reduce that group's overall News Feed distribution. Starting today, globally," said Guy Rosen, Vice President of Integrity at Facebook, in a blog post late on Wednesday.
Facebook said that starting in the coming weeks, when reviewing a Group to decide whether or not to take it down, it will look at admin and moderator content violations in that Group -- including member posts they have approved -- as a stronger signal that the Group violates its standards.
"We're also introducing a new feature called Group Quality, which offers an overview of content removed and flagged for most violations, as well as a section for false news found in the group," added Tessa Lyons, Head of News Feed Integrity at Facebook.
The company has incorporated a "Click-Gap" signal into News Feed ranking. "Click-Gap" looks for domains with a disproportionate number of outbound Facebook clicks compared to their place in the web graph. "This can be a sign that the domain is succeeding on News Feed in a way that doesn't reflect the authority they've built outside it and is producing low-quality content," said Facebook.
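Facebook has not published how "Click-Gap" is computed, but the idea described above can be illustrated with a minimal sketch: compare a domain's outbound clicks from Facebook against a proxy for its standing in the wider web graph, and flag domains where the gap is disproportionate. All names, the inbound-link proxy, and the threshold below are illustrative assumptions, not Facebook's actual implementation.

```python
# Hypothetical sketch of a "Click-Gap"-style signal. The real implementation
# is not public; the authority proxy (inbound link count) and the threshold
# are illustrative assumptions.

def click_gap_score(fb_clicks: int, inbound_web_links: int) -> float:
    """Ratio of a domain's Facebook clicks to its web-graph authority.

    Uses inbound link count as a crude proxy for the domain's place in the
    web graph; add-one smoothing avoids division by zero for new domains.
    """
    return fb_clicks / (inbound_web_links + 1)

def flag_suspect_domains(domains: dict, threshold: float = 50.0) -> list:
    """Return domains whose Facebook clicks far exceed their web authority."""
    return [
        name
        for name, stats in domains.items()
        if click_gap_score(stats["fb_clicks"], stats["inbound_links"]) > threshold
    ]

domains = {
    "established-news.example": {"fb_clicks": 100_000, "inbound_links": 250_000},
    "clickbait-farm.example":   {"fb_clicks": 80_000,  "inbound_links": 40},
}

print(flag_suspect_domains(domains))  # → ['clickbait-farm.example']
```

A domain with heavy Facebook traffic but almost no presence elsewhere on the web scores far above the threshold, matching the pattern Facebook describes for low-quality content farms.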
Apart from that, the company is also expanding the Context Button to images on Instagram. Launched in April 2018, the Context Button provides people with more background information about the publishers and articles they see in News Feed so they can better decide what to read, trust and share.
"We're testing enabling this feature for images that have been reviewed by third-party fact-checkers," said Facebook.

Facebook also said it will bring the "Verified Badge" to its Messenger service. "This tool will help people avoid scammers that pretend to be high-profile people by providing a visible indicator of a verified account," said the company.
These new features aim to make Facebook a safer platform for all. With the multitude of users who access the platform daily, it is vital that such safeguards be put in place.