Hikvision, Nvidia named in contract for 'Uyghur detection'
GPU giant says you can't stop secondary sales, surveillance gear maker maintains innocence
Updated Video surveillance equipment maker Hikvision was paid $6 million by the Chinese government last year to provide technology that could identify members of the nation's Uyghur people, a Muslim ethnic minority, according to physical security monitoring org IPVM.
The payment was documented in a contract between China-based Hikvision and the Hainan Province's Chengmai County, which was obtained by IPVM.
"While the People's Republic of China (PRC) has sharply restricted access to sensitive documents such as this one, this shows that persecution of Uyghur ethnic minorities is ongoing and that Hikvision, in what the authorities called its 'standard configuration,' can and does supply this human rights-abusing software," IPVM's researchers claimed last week.
Beijing regards the Uyghurs as a threat to Chinese sovereignty on grounds of their beliefs and affiliation with central Asian cultures. Human rights groups assert that Uyghurs are surveilled, incarcerated, required to perform forced labor, re-educated to abandon their beliefs and cultural practices, and may even be subjected to sterilization campaigns.
Technology vendors are thought to know their wares assist in those efforts, and perhaps even to develop capabilities that enable Beijing's human rights abuses.
Hikvision earned itself a spot on the US blacklist in 2019 for allegedly being complicit in Beijing's suppression of the Uyghur population. Hikvision has denied being "knowingly" involved in human rights abuses.
Nvidia was also named in this latest contract, though the accelerator maker says it hasn't sold kit to Hikvision since 2019, and isn't involved in this contract.
"We aren't participating in this project and aren't aware of any Nvidia customer supplying it," a spokesperson for the silicon slinger told The Register.
"Our customers are aware that they aren't allowed to ship products in violation of the law or our policies."
"However, we aren't able to control used GPUs that resurface on the secondary market, sold by parties that aren't our customers or partners," the California-based biz added. "We don't have the ability to prohibit any third party from referencing Nvidia products in their marketing materials or design guides."
- Australia gives made-in-China CCTV cams the boot
- UK's Surveillance Camera Commissioner grills Hikvision on China human rights abuses
- Alibaba admits it built facial-recognition-as-a-service to detect oppressed Uyghur minority in China
- UK government to set deadline for removal of Chinese surveillance cams
The contract seen by IPVM, dated December 2022, is for the installation of 210 Hikvision cameras, drones, routers, and camera poles within three months.
The software specified is Hikvision's DS-IF0100-AI full analysis software, which supports facial, video, and human body analysis. The contract also includes analytics services to determine which category of ethnicity an individual falls into: "unknown," "non-minority," or "Uyghur."
The contract apparently also requires a server packing at least eight Nvidia T4 GPUs – a brand IPVM claims Hikvision prefers. Nvidia pointed out the T4 has been on sale for the past five years, so in that time some of the silicon may have been resold into China outside of Nvidia's control.
The Register requested comment from Hikvision last week, but had not received a response at the time of publication, days later.
Both Hikvision and Beijing have previously denied any involvement in human rights violations.
Hikvision has also previously claimed that features allowing the identification of Uyghurs were removed from its software in 2018. ®
Updated to add
Soon after we published this piece, a Hikvision spokesperson told The Register:
It is an undeniable fact that Hikvision offerings do not have a minority recognition function. As reported by The New York Times and Politiken, the company has not developed this capability since 2018. We have clear and longstanding policies in place to prohibit the use of minority recognition technology.