If there is an ethnic group that has suffered incessant persecution by the Chinese Communist Party (CCP), it is the Uyghurs. This Muslim minority has been tracked and beaten down by the Chinese regime in the region of Xinjiang in northwest China. But there is something particular about the case of the Uyghurs: companies are developing and testing Artificial Intelligence (AI) technologies designed to detect them. It was recently revealed that Huawei had done so, and now it is Alibaba's turn.
According to a report in The New York Times, a website for Alibaba's cloud computing business showed "how to use facial recognition software to detect Uyghurs or other ethnic minorities in photos and videos."
“This feature was incorporated into Alibaba’s software that helps platforms monitor digital content for material related to terrorism, pornography and other alert categories,” the article says.
The information came from pages on Alibaba's website that were discovered by IPVM and shared with the Times.
The explanations found on the website have since been deleted. Alibaba excused itself by saying the "facial recognition feature was only used on a trial basis." But the company also edited its website to remove references to Uyghurs and minorities.
Alibaba, a leader in online commerce founded by Jack Ma, one of the richest people in China, has stood out in recent years as it has diversified into sectors such as computing, supermarkets, and cinema.
The revelations by the New York newspaper "could be a blow to the credibility of the company, whose shares are quoted on the New York Stock Exchange and are owned by important international investors," reads the Argentine portal Infobae.
Alibaba and Huawei, cases that should alarm the world
Human rights activists around the world are concerned about the deployment of this kind of AI, which is used explicitly to track or detect members of minority ethnic groups.
When giant Chinese companies such as Huawei and Alibaba test software designed and developed for tracking purposes, in a country where the ruling regime has systematically persecuted the very minority being tracked, the situation becomes delicate.
For example, Maya Wang, senior China researcher with the advocacy group Human Rights Watch, has said that the country "has increasingly used AI-assisted surveillance to closely monitor the general public and oppress minorities, protesters, and others deemed to be threats to the State."
While the visibility of this issue remains relatively low, there are political and cultural battles underway.
In the cultural and media sphere, there are two concrete cases. One is that of Mesut Ozil, the German soccer player with Turkish roots, who spoke out against the Chinese Communist Party at the beginning of the year and immediately suffered consequences in terms of sports and image. The other is that of French soccer player Antoine Griezmann, who cut off business relations with Huawei as soon as it became known that the company had tested software that could send a "Uyghur alarm" to the Chinese police.
And in the political sphere, during 2019 the U.S. government added 28 Chinese entities to a trade blacklist over concerns about their role in the repression of the Uyghurs. The list includes manufacturers of surveillance equipment and emerging Artificial Intelligence companies.