Facebook oversight

The Facebook Oversight Board issued its first five decisions Thursday. The rulings are well thought out and show that the board members, charged with reviewing Facebook's decisions to remove content and making recommendations on Facebook policies, take their job seriously. More than anything, though, they show the futility of moderating content across networks with more than 3 billion users, nearly half the people on earth.

The cases involve posts in five languages and, often, subtleties of meaning and interpretation. Two touch on deep-seated global conflicts: China's oppression of Uighur Muslims and the ongoing border war between Armenia and Azerbaijan. We've long known that the vast majority of Facebook users, now approaching 90 percent, are outside the US, but the breadth of these cases drives home the magnitude of Facebook's challenge.

Facebook has touted automation as one solution to that challenge, but these cases also highlight the shortcomings of algorithms. In one, Facebook's automated systems removed an Instagram post in Portuguese from a user in Brazil showing bare breasts and nipples. But the post was an effort to raise awareness about breast cancer, an exception to Facebook's general policy against nudity and an issue that has bedeviled Facebook for a decade. To its credit, Facebook restored the post before the Oversight Board heard the case, but the episode still underscores the problems with letting algorithms do the work. In the other case, involving a quote purportedly from Nazi propaganda chief Joseph Goebbels, Facebook's memory feature had actually recommended that the user recirculate a post from two years earlier. The older post had presumably been allowed to remain, raising questions about the consistency of Facebook's standards for reviewing content.

A decision on any one of those posts can be enormously complex. In October, a user in Myanmar, writing in Burmese, posted photographs of a Syrian Kurdish child who drowned attempting to reach Europe in 2015, and contrasted the reaction to that photo with what the user said was a "lack of response by Muslims generally to the treatment of Uighur Muslims in China." The Myanmar government has persecuted members of its Rohingya minority, who are Muslim, driving many into neighboring Bangladesh. Facebook has been accused of being used to stoke hatred of the Rohingya.

Facebook removed the post because it translated its text to say there is "indeed something wrong with Muslims psychologically." Given the criticism of its prior treatment of the Rohingya, that was understandable, the board said. But the board translated the phrase slightly differently, as "those male Muslims have something wrong in their mindset." The board said its translators suggested that the terms used were not derogatory or violent, and it ordered the post reinstated.

In general, the board said transparency is "clearly an area where Facebook must urgently improve" and that the platform often isn't being clear with users about why their content is removed. For example, the board said Facebook didn't initially mention its cross-check system, which applies an extra layer of review to posts from certain high-profile users, when it asked for a review of the decision to suspend former President Donald Trump, an omission the board described as "not acceptable" given that it was relevant to a case about how policies are enforced for political leaders. The board also said Facebook has admitted it shouldn't have claimed the cross-check system applies to only a "small number of decisions," because that phrasing could be misleading. The board is now set to review the cross-check system and make recommendations on how it can be changed. In response to the board's criticism, Facebook promised Thursday that it will "strive to be clearer in our explanations to them going forward."
