According to the Wall Street Journal, Facebook has a secret internal system that exempts 5.8 million users from the obligation to abide by the rules of its platform. On Monday the newspaper published an investigation, citing internal Facebook documents it has reviewed, detailing how high-profile users of its services, those deemed "newsworthy," "influential or popular," or "PR risky," are not subject to the same enforcement measures as ordinary users. Facebook built a program called "XCheck," or "cross check," which in many cases has become a de facto whitelist. Over the years, XCheck has allowed celebrities, politicians, athletes, activists and journalists to publish what they want, with little or no consequence for breaching company rules. "For a select few members of our community, we do not enforce our policies and standards. Unlike the rest of our community, these people can violate our standards without any consequences," reads a leaked internal Facebook report. At least 5.8 million people were enrolled in the XCheck program last year, which means a large number of influential users are allowed to post largely unchecked on Facebook and Instagram.
The social media giant acknowledged in 2019 that the program existed for politicians, but the details of XCheck's scale and mismanagement are new. Facebook is apparently aware of XCheck's problems, according to excerpts from the report, but the company has struggled to resolve them. The XCheck program was created to prevent "PR fires": the negative public reactions that occur when Facebook makes an enforcement mistake on the account of a high-profile user.
When typical users post something that is flagged by moderation algorithms or removed by human moderators, they can file an appeal with Facebook. If the appeal fails, there is not much else they can do.
But when high-profile users are moderated, they can air their complaints to their followers, which creates a potential public relations problem. If those users are politicians, they can call for increased regulation of the platform. So, rather than treating these elite users the same as everyone else, Facebook allows them to post whatever they want. If a post is flagged by an algorithm, it is sent to a separate group of moderators, better-trained full-time employees, for review.
Yet as the roster of users grew, XCheck's moderation teams apparently could not keep pace. "We are currently reviewing less than 10% of XCheck content," one document says. As a result, elite users were able to post all kinds of content, from misinformation to threats of violence to revenge pornography, allowing messages that nonetheless violate Facebook's policies to appear in the news feeds of thousands or millions of people.
Special treatment for VIP users
Even in cases where content is ultimately removed, Facebook treats VIP users differently from others. In the leaked documents, the case of Neymar, the Brazilian soccer player, stands out. In 2019, he posted a video on his Facebook and Instagram accounts containing nude photos of a woman who had accused him of rape. He said she was extorting him.
For normal users, posting non-consensual intimate images triggers an immediate response: the content is deleted and the offending account is deactivated. Neymar's video, by contrast, stayed online for more than a day. Ordinary moderators could not touch it, and by the time the XCheck team pulled it down, 56 million people had seen it. The video was reshared 6,000 times, and many commenters harassed and intimidated the woman. Neymar has denied the rape allegations, and no charges have been filed. But despite posting what Facebook itself classifies as "revenge porn," Neymar's account was not deleted.
Last year alone, the cross-check system allowed content that violated the rules to be viewed more than 16 billion times before being deleted, according to internal Facebook documents cited by the Wall Street Journal. The report also says Facebook "misled" its Oversight Board, which asked about the cross-check system in June while examining how the company should handle Donald Trump's indefinite suspension. The company told the board at the time that the system affected only a "small number" of its decisions and that it was "not feasible" to share more data. The Oversight Board has repeatedly expressed concern about the lack of transparency in Facebook's content moderation processes, particularly the company's inconsistent handling of high-profile accounts. "The Board has repeatedly recommended that Facebook be far more transparent in general, especially regarding its management of high-profile accounts, while ensuring that its policies treat all users fairly," the board said in a statement shared on Twitter.
In a series of tweets, Facebook communications officer Andy Stone pointed out that the company had already publicly described its cross-check system in 2018. "Ultimately, at the center of this story is Facebook's own analysis that we need to improve the program. We know our enforcement is not perfect and there are tradeoffs between speed and accuracy. We have new teams, new resources, and an overhaul of the process, which is an existing workflow at Facebook," Stone tweeted.
Mark Zuckerberg has long repeated one of his favorite refrains: Facebook executives do not want the platform to be the "arbiter of truth," deciding what is right and wrong and then keeping or removing content accordingly. But this hands-off approach has put Facebook in an awkward position, especially in recent years, as critics say misinformation is rampant on the site and some Republicans accuse the company of serving a liberal agenda and discriminating against conservatives online. Facebook has taken several steps in light of this scrutiny: reports surfaced in June that it would stop giving politicians special treatment when enforcing its content rules.
And you?
What is your opinion on the subject?
See also:
Facebook is preventing its employees from reading an internal report examining the social network's role and failings in the lead-up to the Capitol riot. You can read it here
Mark Zuckerberg reportedly quietly approved a change in Facebook’s moderation algorithm that reduced traffic to progressive news sites
Content moderation: Facebook appoints the first members of the independent Oversight Board, which will be able to overturn decisions taken by the company or its CEO, Mark Zuckerberg