
The QAnon conspiracy theory, promotions of bogus health treatments and calls for violence based on false claims of election fraud have a common thread: Facebook groups.

These forums for people with a shared interest can be wonderful communities for avid gardeners in the same neighborhood or parents whose children have a rare disease. But for years, it has also been clear that the groups turbocharge some people's inclinations to get into heated online fights, spread engrossing information whether or not it's true, and scapegoat others.

I don’t want to oversimplify and blame Facebook groups for every bad thing in the world. (Read my colleague Kevin Roose’s latest column on suggestions for how to target polarization and engage people in purposeful activities.) And mitigating the harms of Facebook is not as simple as the company’s critics believe.

But many of the toxic side effects of Facebook groups are a result of the company’s choices. I asked several experts in online communications what they would do to reduce the downsides of the groups. Here are some of their suggestions.

Stop automated recommendations. Facebook has said it would extend a temporary pause on computerized recommendations for people to join groups related to politics. Some experts said that Facebook should go further and stop computer-aided group suggestions entirely.

It’s nice that Facebook suggests a forum about growing roses to someone who posts about gardening. But for years, Facebook’s group recommendations have proved easy to manipulate and have pushed people toward increasingly fringe ideas.
