
The problem with Facebook is not a story of good guys and bad guys; it's something bigger

Facebook is in the spotlight. The leak of documents by Frances Haugen, a former employee of the company, has unleashed a storm of criticism from journalists, lawyers and the public. The company prioritizes profit at the expense of people, Haugen has stated.

This Monday, a dozen US media outlets with exclusive access to these documents, the so-called “Facebook Papers”, published numerous articles showing how many of the staff's efforts to mitigate the social network's problems are ignored because they would hurt its advertising business. Haugen testified that Facebook does not want to stop the harm the social network allegedly causes because doing so would interfere with its profits and growth.

The founder, Mark Zuckerberg, sounded rattled and defensive when he responded to the accusations on Monday's call with shareholders. He defended the company, claiming that “it is a coordinated effort to selectively use only a part of all the documents that circulate within Facebook to show a false image of the company.” Also on the table is the event scheduled for October 28 where, according to widespread rumors, the company's future and a change of course will be discussed.

Collision of opinions between employees and leaders

A clear idea emerges from this coordinated coverage: there is a great distance between employees and the company's top executives. The company underestimates, or outright ignores, problems that it magnifies or that it has created.

There are numerous examples of employees sounding the alarm about how Facebook becomes a loudspeaker for extremism and misinformation, incites violence, and radicalizes and polarizes political discourse. Employees recognize when their platform turns harmful to society, and they plead with leadership to do more. But leadership, according to Haugen, ignores them.

Facebook's products “harm children, fuel division and weaken our democracy,” said the former employee. “The company should declare ‘moral bankruptcy’ if it wants to move past all this.”

The person ultimately responsible is Zuckerberg, who is portrayed in the leaked documents as an absolute monarch within a publicly traded technology company that serves more than 3 billion people around the world. He owns the majority of the company's voting shares, controls its board of directors, and increasingly surrounds himself with executives who do not seem to question his ambitious vision of “connecting the world” at any cost.

Facebook: speaker of evil?

The assault on the Capitol was a turning point for American democracy in which Facebook played a leading role. The growth of the 'Stop the Steal' group on the social network fueled the shameful incident encouraged by former President Donald Trump.

One might think it is obvious that people use Facebook to connect or to organize events of any kind, and that the purpose of those events depends little on the social network itself. It is inevitable that some individuals and communities will share content for extremist ends, to incite violence, or to spread misinformation. This is true. But, according to the leaked documents, Facebook's role is not that of a simple passive broadcast medium; it is a catalyst of hatred. Before the rise of social media, it was difficult to spread messages like these quickly and effectively. And, on top of that, the company helps such groups attract more users through its algorithm.

Former Facebook employee Frances Haugen during her appearance before the US Senate Committee on Commerce, Science and Transportation.

One example suffices: Facebook replaced likes with reactions tied to human emotions such as love, laughter or anger. Posts tagged with angry reactions were precisely the most likely to contain content that violated the platform's standards, such as disinformation. Yet the company prioritized showing this type of content over posts that only received likes. In other words, angry, indignant and often harmful content is not only not “hidden”, but prioritized in users' feeds because it increases time spent and interactions in the application.

Metrics as religion

On Facebook and Instagram, the internal value the algorithm assigns to any post is determined by the probability that the user will react to it. The higher that probability, the more likely the post is to appear in your feed. And what is the content most likely to be shared? Complaints, misinformation, anger, calls for violence...
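The scoring logic described above can be sketched in a few lines of Python. To be clear, this is a minimal illustration, not Facebook's actual system: the reaction weights, function names and probabilities below are entirely hypothetical, invented for this example.

```python
# Hypothetical sketch of engagement-predicted ranking, as described above.
# All weights and names are invented; the real system is far more complex
# and not public.

REACTION_WEIGHTS = {
    "like": 1.0,
    "love": 5.0,
    "anger": 5.0,    # emotional reactions reportedly weighted above plain likes
    "comment": 15.0,
    "share": 30.0,
}

def rank_feed(posts, predict):
    """Order posts by expected engagement.

    posts:   list of post identifiers
    predict: maps (post, reaction) -> probability the user reacts that way
    """
    def score(post):
        return sum(w * predict(post, r) for r, w in REACTION_WEIGHTS.items())
    return sorted(posts, key=score, reverse=True)

# Toy usage: a post that provokes anger and shares outranks a merely
# likeable one, even though its individual probabilities are lower.
probs = {
    ("calm_post", "like"): 0.6,       # score: 1.0 * 0.6 = 0.6
    ("outrage_post", "anger"): 0.4,   # score: 5.0 * 0.4 + 30.0 * 0.1 = 5.0
    ("outrage_post", "share"): 0.1,
}
feed = rank_feed(["calm_post", "outrage_post"],
                 lambda p, r: probs.get((p, r), 0.0))
print(feed)  # the outrage post is ranked first
```

The point of the sketch is the weighting itself: once certain reaction types carry a higher weight, content optimized to provoke them rises to the top regardless of its quality.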

In 2018, Zuckerberg himself noted that the content with the most interactions was always the content closest to the line of what the company does not allow.

The Atlantic journalist Adrienne LaFrance spoke with several former employees who described a company culture obsessed with metrics, extreme even by Silicon Valley standards.

“Facebook workers are under enormous pressure to quantitatively demonstrate their individual contributions to the company's growth goals,” she writes in her article. “New products and features are not approved unless the employees proposing them demonstrate how they will drive engagement.”

This has fostered an internal clash that pits the product and engineering teams directly against the Integrity Team, which is charged with mitigating the harm the platform can cause in society.

The Facebook algorithm question

The Facebook Papers suggest that the company's main priority is to grow and to squeeze out every possible penny through its algorithms. Moreover, moderation efforts are insufficient to control all of the content, and on many occasions they are not carried out because they conflict directly with the company's economic interests.

If the algorithm tends to amplify harmful speech, why not remove it and display posts in chronological order, so that potentially harmful content is not prioritized? One might assume this is not done because it would directly reduce the time users spend on the social network. But, in reality, Facebook has already tried it, and the experiment went badly.

In 2018, the algorithm that prioritizes certain content in the News Feed was turned off for 0.5% of the social network's users. The experiment found that interactions with the displayed content fell significantly, and that content from groups took on an even more prominent role. Surprisingly, Facebook benefited from the change, because users had to spend more time finding something that interested them and therefore saw more ads.

But “turning off” the algorithm, according to the researchers, offered a worse experience in all measurable areas.

A solution not as simple as they claim

Journalists have focused on describing the specific cases in which Facebook has failed, and on its dangerous culture of prioritizing growth over doing the right thing. That is legitimate, and it is the job the press has to do. However, suggesting that all the problems are the result of wrongdoing by an evil company run by bad people borders on stupidity. It is a very dangerous simplification. This is something else: moderating content is very difficult, and it is even more so when practically everyone is on Facebook.

Articles like these indirectly advocate censorship: moderate more and better, they say, and hope that one day, armed with laws and by hiding the content they dislike, Facebook will do things right. But what is doing things right? The line between policing misinformation and censoring dissenting views can be very thin if we let regulation and algorithms decide.

Who manages whom on Facebook? Furthermore, we cannot expect a company whose main objective is to generate wealth to threaten its own business. Nor, on the other hand, can we expect, as Zuckerberg (so selflessly) asks, that laws will impose moderation standards that only Google and Facebook can afford. On the call with shareholders, he said that “we should want all the other companies in the industry to carry out and achieve the results that we have achieved.” That sentence is dangerous. He is praying for laws that require an investment only Facebook can afford, in pursuit of results that only they believe they have achieved.

Do we really want governments to decide how social networks should work? Yes, Facebook is causing problems in our society; but the remedy some demand may be worse than the disease we want, and must, cure.

The conflict does not stem only from Zuckerberg's unbridled ambition, his obsession with metrics, or the insufficient efforts to moderate content, especially in developing countries such as India. It also has its origin in human beings and in how society works. Now the neighborhood crank becomes the world's crank, and we cannot expect companies to silence them all. What is illegal must be silenced, and that is the job of the law alone. Or at least it should be.
