‘Automated racism’ — facial recognition software targets Uyghurs

Domestic News

In a front-page story in today’s New York Times print edition, tech reporter Paul Mozur describes “the first known example of a government intentionally using artificial intelligence for racial profiling.” It is, of course, the facial recognition systems employed by security officials across China, and the race targeted is Uyghurs.

“Five people with direct knowledge of the systems” talked to the New York Times, which also “reviewed databases used by the police, government procurement documents and advertising materials distributed by the A.I. companies that make the systems.”

“Integrated into China’s rapidly expanding networks of surveillance cameras,” the technology identifies Uyghurs and “keeps records of their comings and goings for search and review,” a practice that the Times says may usher in “a new era of automated racism.”

It’s not clear how many of China’s 11 million Uyghurs are being targeted at this stage, or where the technology has been implemented. But the Times says “police are now using facial recognition technology to target Uyghurs in wealthy eastern cities like Hangzhou and Wenzhou and across the coastal province of Fujian.” Elsewhere, authorities “in the central Chinese city of Sanmenxia, along the Yellow River, ran a system that over the course of a month this year screened whether residents were Uyghurs 500,000 times” — that is, the system performed half a million ethnicity checks on residents in a single month.

Demand for the Uyghur identification tech is growing: “Almost two dozen police departments in 16 different provinces and regions across China sought such technology beginning in 2018, according to procurement documents. Law enforcement from the central province of Shaanxi, for example, aimed to acquire a smart camera system last year that ‘should support facial recognition to identify Uyghur/non-Uyghur attributes.’”

Paul Mozur also posted a Twitter thread with some extra information on what he calls “a massive ethical leap for AI.” Besides ethnicity, he notes that facial recognition technology is being used to screen for “faces of the mentally ill, drug users, and petitioners,” among other groups.

SenseTime — one of the five billion-dollar companies (Yitu, Megvii, SenseTime, CloudWalk, and Hikvision) mentioned in Mozur’s article as having worked on developing this technology — is now backing away from its most controversial application in Xinjiang. Noting that this is “the first time a major Chinese technology [company] has opted out of operations in the region,” the Financial Times reports (paywall):

SenseTime, a facial recognition software company that supplies Chinese police, set up a “smart policing” company with Leon, a major supplier of data analysis and surveillance technology in Xinjiang, in November 2017.

It has now sold its 51 percent stake in the joint venture, Tangli Technology, to Leon, which said Tangli would continue with its strategy and that its research team had mastered key technologies.

Related stories:

In the confines of a cramped, dark room, Aigerim was kicked repeatedly in the stomach by a guard wearing heavy, metal-tipped boots. With her mouth taped shut and limbs chained, she couldn’t cry out in pain or block the blows.

When Chinese state authorities prepared to release Gulbahar Jelil, an ethnic Uyghur woman born and raised in Kazakhstan, they told her that she was forbidden to tell anyone about what she had experienced over the one year, three months, and 10 days in which she was detained…

She didn’t listen.

About 3,000 Uyghurs have found sanctuary in Australia. But as some of them draw attention to China’s camps, they are putting their adopted homeland in an awkward position, pressing it to speak out against its largest trading partner. More than a dozen Uyghurs who are Australian permanent residents are missing in China and presumed to be in detention, activists say.