How Facebook is trying to make its groups less toxic


Slowing down the pace of conversations on the platform is the bet Mark Zuckerberg is making to defuse at least some of the conflicts that plague his social network

There was a moment in Facebook's history, not long ago, when it looked like groups would be the future of the platform. The year was 2019: the Cambridge Analytica scandal had recently shown how the sensitive data of millions of users had been exploited in the campaigns for Brexit and for Donald Trump's election in 2016, and at the company's annual developer conference Mark Zuckerberg unveiled what was meant to be the biggest redesign of the platform since its launch in 2004: in essence, a series of changes that put groups at the center.

"We are focused on building the digital equivalent of the living room, where you can interact in any way you want in private," Zuckerberg announced at the time. "There is a real opportunity to connect larger numbers of people through groups, which will become a significant part of the social infrastructure in our lives. If we can improve the service and connect one billion people to communities, we can strengthen our social fabric." The idea was to counter the polarization and the increasingly entrenched dynamics of abuse that infested the app. The fact that this went hand in hand with the desire to keep users on the site for as long as possible, and to collect their data, did not hurt.

Two years later, Zuckerberg cannot say he has won that bet. The very groups that he hoped would strengthen the social fabric were, among other things, central to the rise of the Stop the Steal movement – which for months refused to recognize Joe Biden's victory in the November 2020 election – and of QAnon, as well as to fueling anti-vaccine rhetoric internationally. The company itself was aware of what was happening: to quote a Wall Street Journal investigation, "Facebook executives have been aware for years that the tools that fueled the rapid growth of groups were an obstacle to their efforts to build healthy online communities."

Now Facebook seems to want to retrace its steps, once again chasing the dream of cultivating groups as healthier, less toxic spaces for their users. To slow down conversations and let users think a little longer before weighing in on emotionally charged issues, the social network will shortly roll out globally a new tool that lets moderators restrict comments on a post for a set period of time or for specific users – for example, those who have only recently joined Facebook or who have previously violated community rules. Moderators will also be able to limit how frequently specific group members can comment. Everything will be managed from a new space, Admin Home, where admins can see which posts, members and recommendations in the group need their attention. The platform is also testing an artificial intelligence feature that would automatically flag "controversial or unhealthy conversations" taking place in the comments, although it has not explained how this would work.

"Though small, this particular update is an admission that building a healthy online community means that, sometimes, people shouldn't be able to react and comment immediately with whatever thought comes to mind," commented reporter Sarah Perez on TechCrunch, calling into question one of the key assumptions that has underpinned social interactions online for years.

As designer Nick Punt explained some time ago, the way online spaces are currently designed, from Facebook groups to the Twitter timeline, means that "the reputation, perspectives and history of each individual are difficult to ascertain, and therefore their words must be taken literally. This, coupled with an almost total lack of standards in community participation and a high degree of variance in participants' knowledge, causes the environment to naturally tend towards conflict and tribalism." Could slightly slowing down the pace of a heated conversation help defuse at least some of these conflicts? This seems to be Zuckerberg's new bet.

