Facebook: “Private companies like ours shouldn’t make so many complex decisions on their own”

Op-ed. While the Internet has revolutionized the world over the past twenty years, the digital revolution has brought with it many new challenges. Incredibly complex issues have entered the public debate, and it is perfectly legitimate to expect companies like Meta to be accountable for how they deal with questions such as content moderation or the role of algorithms. But it is wrong to claim that our company derives any benefit from hatred, or that it places its profits above the protection of people.
We have absolutely no economic interest in allowing harmful content to remain on our platforms. Billions of people use Facebook and Instagram because they have positive experiences there. Neither our users nor the advertisers who run ads there want to see hateful content. Our investments on this front are unparalleled: in 2021 alone, we will have spent more than $5 billion [about €4.3 billion] to protect our users, more than any other tech company. Today, more than 40,000 people work on this essential mission, covering more countries and languages around the world every year.
These massive efforts are paying off. Hate speech now accounts for less than 0.05% of the content that users see on Facebook, a figure that has nearly halved over the past three quarters. Today we detect over 97% of the hateful content we remove before anyone even reports it to us. Our action will probably never be perfect, and no one today has a solution for eradicating all hate speech from the Internet, but the figures we publish each quarter attest to our significant progress in this area.

Societal impact

We are also questioned about the algorithms we use to rank the content that users see on our platforms. I want to be clear on this point: the claim that we design these algorithms to promote sensational or anger-inducing content is simply wrong. It would also make no economic sense for our company, and would run counter to the expectations of our users and of advertisers, who do not want their ads to appear alongside this type of content.

