The 'Facebook Papers' continue to shed light on some of the darkest secrets of the company led by Mark Zuckerberg. This time the spotlight falls on emoji reactions and how the social network prioritized the "angry" reaction over the traditional "like," thereby fostering feelings of anger and systematically widening the reach of controversial posts.
The American newspaper The Washington Post, drawing on the documents revealed by Frances Haugen, reports that Facebook programmed the social network's ranking algorithm to promote more emotional and provocative content. Starting in 2017, it gave emoji reactions five times more weight than traditional likes.
The change came as the social network was trying to reverse a sharp decline in the amount of content posted and in interaction between users. By drastically boosting the weight of emoji reactions, the new formula meant that posts receiving them could reach a much wider audience.
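To make the reported change concrete, here is a purely illustrative Python sketch, not Facebook's actual code: it shows how counting an emoji reaction five times as much as a like inflates a post's score and, under a reach-by-score scheme, its audience. The function name, weights as ranking inputs, and the scoring formula are assumptions for illustration only.

```python
# Illustrative sketch only: a toy scoring function based on the reported
# 2017 change, in which emoji reactions were weighted five times more
# than a plain like. The formula and names are hypothetical, not
# Facebook's actual ranking code.

EMOJI_WEIGHT = 5  # reported multiplier for emoji reactions
LIKE_WEIGHT = 1   # baseline weight for a traditional like

def engagement_score(likes: int, emoji_reactions: int) -> int:
    """Toy engagement score: emoji reactions count 5x as much as likes."""
    return LIKE_WEIGHT * likes + EMOJI_WEIGHT * emoji_reactions

# A post with 100 likes scores 100, while a post with only 40 "angry"
# reactions scores 200 and would be pushed to a wider audience.
print(engagement_score(likes=100, emoji_reactions=0))  # 100
print(engagement_score(likes=0, emoji_reactions=40))   # 200
```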
Despite warnings from its internal team, which noted that the change could favor "controversial" posts, including those that provoke anger, spam and clickbait, Facebook decided to move forward. Emoji reactions had proved to be an ideal feature for keeping users on the platform and thereby boosting its advertising business model.
A 2019 study by Facebook data scientists confirmed the warnings. It noted that posts that primarily received "angry" reactions had a "disproportionate likelihood of including misinformation, toxicity, and low-quality news." In other words, the core mechanics of the social network were amplifying its worst content in order to retain users.
Facebook, inconsistencies inside and out
Credit: Unsplash
On the one hand, the social network's mechanics favored controversial content and stoked feelings of anger. On the other, its already inefficient and much-criticized moderation team was trying to stem an avalanche of negative posts that the platform itself was driving from within.
"Anger and hatred is the easiest way to grow on Facebook," Haugen said last Monday. Facebook spokesperson Dani Lever said the company is working to understand what content creates "negative experiences" so it can reduce its distribution. "This includes content that has a disproportionate amount of 'angry' reactions," Lever added.
Beyond the company's explanations and whatever changes it may eventually make, the disclosed documents make its priority clear: keeping users on the platform, even at a very high cost. The Washington Post notes that some employees questioned the company's practices, but "sometimes their comments were rejected."