Facebook rejects claims regarding weak supervision system
Facebook, the leading social media platform, uses an automated system to identify and remove hateful speech and other content that violates its community guidelines.

The Wall Street Journal reported that the system removes only a small fraction of offensive content. Facebook, in its defense, claimed that its system has recently done an excellent job of taking down a large amount of restricted content from the app.

According to the WSJ's reporting, an internal investigation in recent months found that Facebook removes only around 3-5% of prohibited content; moreover, information leaked by company employees indicated that the company successfully moderates only a small percentage of the material posted on its platform.

According to recent reports, Facebook's team for handling user-reported issues is believed to have stopped reviewing such requests.

Facebook, on the other hand, has vehemently denied these allegations and maintains that its system is doing its best at removing posts that violate its guidelines.
