Facebook is pushing yet another set of new features and policies designed to minimize harm in the homestretch to Election Day while also increasing “community” for users. But these features will do nothing to mitigate existing problems, and they will likely cause new, more widespread harms to both users and society.
The most recent issue is a frustrating set of changes to the way that Facebook handles groups. Last week, Facebook announced yet another new way to “help more people find and connect with communities,” by putting those communities in your face whether you want to see them or not. Both the groups tab and your individual newsfeed will promote group content from groups you are not subscribed to in the hope that you will engage with the content and with the group.
These changes are new, small inconveniences piled atop frustrating user-experience decisions that Facebook has been making for more than a decade. But they are the latest example of how Facebook tries to shape every user’s experience through black-box algorithms, and of how this approach harms not only individuals but the world at large. At this point, Facebook is working so hard to ignore expert advice on how to reduce toxicity that it looks like the company doesn’t want to improve in any meaningful way. Its leadership simply doesn’t seem to care how much harm the platform causes as long as the money keeps rolling in.