Huawei and top Chinese AI startup accused of building 'Uyghur alarm' facial recognition scanner for govt
Plus: Graphcore pits its latest AI chips against Nvidia's A100, and Google CEO apologises for ousting a top AI ethics researcher
In brief Huawei, already sanctioned by the US for helping the Chinese government crack down on Uyghur Muslims, built facial recognition software to surveil the ethnic group and alert authorities whenever a positive match was detected, it's claimed.
The company collaborated with AI startup Megvii to test a camera system that predicts people’s age, sex, and ethnicity from their faces. One of the functions the software was capable of performing was labelled as “Uyghur alarm”, according to documents shared with the Washington Post.
The feature, tested in 2018, allegedly sends out an alert signal to police in China whenever its cameras positively identify someone as a Uyghur. There is ample evidence that the Chinese Communist Party monitors the Muslim population in regions like Xinjiang and exerts control using technology like facial recognition.
Huawei said it was “simply a test and it has not seen real-world application. Huawei only supplies general-purpose products for this kind of testing. We do not provide custom algorithms or applications.” Megvii claimed its systems were not designed to target specific ethnic groups.
Google CEO wades in on the company’s ethical AI drama
The Chocolate Factory has a massive PR crisis on its hands after it recently parted ways with a prominent Black female AI researcher. There’s been so much backlash online that CEO Sundar Pichai announced this week he would investigate how the situation was handled.
In a company-wide email published by Axios, he apologised for the stress the episode has caused inside Google and in the wider community. Timnit Gebru left her role as manager on Google’s ethical AI team after she was told to remove her name from a research paper on giant language models.
The conditions under which she left are still murky. Google said she resigned; Gebru said she was fired.
“First - we need to assess the circumstances that led up to Dr. Gebru’s departure, examining where we could have improved and led a more respectful process,” Pichai’s email read. “Second - we need to accept responsibility for the fact that a prominent Black, female leader with immense talent left Google unhappily.”
Graphcore claims its latest AI chips are better than Nvidia’s A100s
The British AI startup has released a set of benchmark test results for a variety of machine learning tasks, comparing its latest second-generation Intelligence Processing Unit (IPU) against Nvidia’s A100.
Graphcore claims its product is faster at training large natural language models, computer vision systems, and text-to-speech neural networks, and at performing inference too. You can see the full list of comparisons here. Take note, however, that the number of processors used often differs between the systems being compared, so the figures are not a straightforward chip-for-chip match-up.
Do you want to help companies use TensorFlow?
Google has launched TensorFlow AI Service Partners, a new service that connects expert TensorFlow coders with enterprises that want to roll out AI technology.
The partnership is essentially a consulting service offered by Google. It will look at what tools businesses are trying to build, and work out what data they have, which models are best suited to a particular application, and how to deploy them.
Google is looking for partners to help support the service. Some startups that provide cloud computing or data labeling, for example, have signed up so far. ®