Following growing criticism of Facebook groups spreading extremist views, the platform has clamped down on its closed communities with a set of new changes.
In a bid to be more proactive in detecting problematic content, Facebook has announced that it will work to enhance the transparency surrounding Groups by introducing a new tool called Group Quality.
Group Quality uses artificial intelligence to crawl through groups for any content that violates the platform’s rules and penalise offenders accordingly. Additionally, group administrators will be given more control over what members post into groups. In a separate announcement, Facebook said administrators will now be able to make Groups either public or private, in a bid to remove certain groups from search results.
Groups that used to be “secret” will now be “private” and “hidden.” Groups that were “closed” before will now be “private” and “visible.” Anything set to public before will remain that way.
Why has Facebook clamped down on Groups?
It will come as no surprise that this new focus on Groups stems from racist and highly offensive communities spawning on the platform. There has been a slew of recent far-right terrorist acts whose perpetrators’ online activities have been linked back to the likes of Reddit, 4chan and Facebook Groups – so it’s a good thing that Facebook is taking action.
However, critics have hit back, saying the changes may push questionable content further underground to even more undetectable depths. Allowing administrators to change a group’s privacy status risks making it harder for outsiders to identify and report content violations.
“Being in a private group doesn’t mean that your actions should go unchecked,” Tom Alison, Facebook’s vice president of engineering, said in a statement. Although I wholeheartedly agree, I also think Groups should be transparent, and while admins can dictate how searchable their groups are – there will always be a disconnect between regulation and extreme online behaviour.
Good intentions, but…
Earlier this year, Facebook pushed its Groups and Communities features in a bid to gather like-minded individuals together to discuss common interests. The purpose was to help users create a group for anything – from family reunions to after-work sports teams and book clubs.
However, this exercise has instead created online echo chambers of extremism and hate speech that are now in the process of being shut down. The policy change could have major implications for Groups, as Facebook has previously proven more receptive to removing Pages that repeatedly violate its community standards.
Pages are often set up as part of a larger network. For example, Facebook was criticised for keeping the Infowars store page up on the site after it had banned Alex Jones and Infowars. At the time, a Facebook spokesperson said the Infowars store page hadn’t yet violated its community standards.
The question, ‘Why has Facebook clamped down on Groups?’ is open to interpretation: some say it’s highly subjective and depends on what individual people find offensive. But as Facebook users, we have to do as we’re told. Facebook has changed its rules around Groups several times in recent years, and the new policies may represent a step in the right direction, but it remains to be seen whether this next move will make a difference.