
Facebook adds group moderation tools to limit misinformation

Photo: Facebook

Facebook announced on Wednesday that it is deploying new anti-misinformation features aimed at reducing the spread of potentially harmful content within Facebook groups.

Group admins can now choose to automatically decline posts that Facebook’s fact-checkers have identified as containing false information. Facebook hopes that rejecting these posts before other users ever have a chance to interact with them will ultimately “reduce the visibility of misinformation.”

The company also said it would expand its “mute” feature, giving group admins and moderators the ability to temporarily suspend members from posting, commenting, or reacting in a group. The suspend feature will also let admins and mods temporarily block certain users from accessing group chats or entering a room in a group. In addition, the new features will allow administrators to automatically approve or deny membership requests based on criteria of their choosing.

Finally, Facebook announced that it will also introduce updates to its admin onboarding features, providing group admins with more tools, like an overview page and a summary information feature, to help with community management. Taken together, Facebook said, it hopes the changes will put more enforcement power and judgment in the hands of group leaders. This empowerment of admins and moderators seems to take a page out of the playbook of Reddit, whose moderators wield so much discretion that the social network has become known for wildly divergent, sometimes opposing, content standards across its communities. Facebook did not immediately provide more details on when the tools will be released.

Contacted for comment by Gizmodo, Facebook provided a list of its past efforts to combat misinformation in groups. The company said in a statement: “We have been doing a lot to keep FB groups safe for several years… To fight misinformation on Facebook, we’re taking a ‘remove, reduce, inform’ approach that relies on a global network of independent fact-checkers.”

This isn’t the first time Facebook has tried to introduce tools that encourage group leaders to clean up their communities. Last year, the company gave admins the ability to appoint designated “experts” in their groups. These experts’ profiles would display official badges next to their names, intended to signal to other users that they were particularly knowledgeable on a given subject.

For some context, Facebook partners with around 80 independent organizations, including the Associated Press, The Dispatch, USA Today, and others, all of which are certified by the International Fact-Checking Network to review content. These fact-checkers identify and review questionable content to determine whether it rises to the level of misinformation. The fact-checking program began almost six years ago.

Critics have long pointed to Facebook’s relatively hands-off approach to moderating content in groups as a catalyst that has helped create mini-incubators of misleading content across the web. Others have blamed Groups specifically for contributing to the rise of fringe political movements like QAnon and Stop the Steal, which ultimately fueled the Capitol riot on January 6. According to an analysis conducted by the Washington Post earlier this year, at least 650,000 posts questioning the legitimacy of the election circulated in Facebook groups between election night and the riots, averaging about 10,000 posts per day.

Facebook’s modest changes come amid heightened public concern over misinformation related to Russia’s invasion of Ukraine. Fake images and videos (some taken from video games) purportedly showing the fighting raging across the country spread like wildfire just hours after the invasion began.