Facebook developed a system that largely exempts numerous celebrities, politicians or journalists from complying with its user rules, according to leaked internal documents published Monday by The Wall Street Journal (WSJ).
The program, known within the company as “XCheck,” was originally created as a mechanism for reviewing in greater detail the enforcement actions taken against high-profile accounts, but it ended up shielding many prominent users from the rules imposed on everyone else, the newspaper notes.
According to the documents, “XCheck” has at times shielded public figures whose posts contained incitement to violence or harassment — content that would normally earn other users sanctions such as account suspension.
As an example, the WSJ points to the case of Brazilian footballer Neymar, who in 2019 posted on Facebook nude photos of a woman who had accused him of rape, in an attempt to defend himself. The photos were seen by millions of people before the social network deleted them, and Facebook decided not to act against the player’s profile.
The documents also note that some of these VIP accounts have shared, without consequence, content that Facebook’s fact-checkers deemed false — ranging from claims that vaccines are deadly to words falsely attributed to former President Donald Trump calling all asylum seekers “animals.”
In total, at least 5.8 million users were included in “XCheck” in 2020, contradicting Facebook’s claims that the program covered only a small number of people.
In general, moderation on the platform created by Mark Zuckerberg uses automated systems to detect violations of its rules against harassment, sexual content, hate speech or incitement to violence.
In some cases, content is removed automatically, while in others, moderators from external companies hired by Facebook analyze the messages, photos, or videos flagged by these systems or reported by users.
Accounts included in “XCheck,” however, receive more favorable treatment: moderators cannot remove their content immediately, and the review is instead passed on to Facebook employees and, on occasion, to senior executives.
According to the WSJ, in many instances this leads to no action being taken on problematic content by celebrities.
In the documents obtained by the WSJ, Facebook acknowledges the problems with this system and says it has tried to modify the program, yet the number of privileged accounts has continued to rise.
Andy Stone, a spokesman for the company, responded to the report on Twitter by denying that there are two categories of users and defending the decision to give the content of certain prominent accounts a second review in order to avoid errors.
According to Stone, the internal documents show only that Facebook wants to improve this program, and he stressed that this is what the company has been doing in recent years.