Microsoft president: We said no to Cali cops' face-recog tech – and we won't craft killer robots
Why? Because biased AI is bad news for minorities
Microsoft president Brad Smith has revealed that the company turned down an order from California cops for its facial recognition technology over human rights concerns.
Speaking at Stanford University this week, Smith said Microsoft had concluded that the technology would probably lead to innocent women and minorities being wrongly held by police, thanks to inherent biases in the AI's training data.
In other words, because the facial recognition system had been trained largely on photos of white men, it was more likely to produce false positives for anyone who wasn't white and male.
"Anytime they pulled anyone over, they wanted to run a face scan", Smith said to attendees at the "Future of human-centered AI" conference, explaining that that scan would then be run against a database of police suspects. Microsoft went away and concluded this was likely to create problems, returned, and told the unnamed organization "this technology is not your answer," according to Smith.
Smith was giving examples of where Microsoft had followed through on its principles of "having the courage to say No, and to walk away from deals when we know our tech and AI products will be used to harm any segment of human life - even if it means being unpopular in our company or outside Microsoft."
Another example he cited, according to Reuters, was turning down a deal to install its facial recognition software on cameras covering the capital city of a country. The decision was made largely because the country has been flagged as "not free" by independent watchdog Freedom House, and Microsoft was concerned its technology would be used to squash freedom of assembly.
Let's see
Smith didn't name the country, but Freedom House's map shows it would most likely have been a country in Southeast Asia, the Middle East, or Africa.
The software executive shared the stage with the United Nations High Commissioner for Human Rights, Michelle Bachelet – a former president of Chile – who argued that ethical codes were not sufficient and companies had to ground their work in human rights.
But Bachelet also had a dig at the lack of diversity in Silicon Valley, noting that "tech companies need staff who represent the world – women, African-Americans – in order to prevent their tools from exacerbating bias and discrimination."
Smith acknowledged – as the tech sector does once a year, every year, when its dreadful diversity stats are published – that Microsoft "has a long way to go to achieve a diverse and effective team." But he agreed that the answer to avoiding biased AI was to build a diverse team.
Smith said that Microsoft had agreed to supply its facial recognition technology to an American jail, having reached the conclusion that it was a self-contained environment and would likely improve safety.
Broken China
But he sidestepped a question about a report that said Microsoft was working on AI with a university that is run by the Chinese military. Three AI papers co-authored by Microsoft researchers and researchers from China's National University of Defense Technology have been published, including one that digs into a new method of identifying where images are taken by analyzing human faces – something that some fear could give the Chinese authorities even greater surveillance capabilities.
Smith played down the paper as covering "basic advances in machine learning" and – as one attendee noted – threw in a strawman argument to avoid further discussion by arguing that "it doesn't mean we should stop doing research, or work with people in China."
Smith also agreed that there was a need for more regulation around facial recognition and other forms of AI, arguing that without it, data would become a commodity and "companies would race to be the first to market to leverage that commodity into dollars, embarking on a race to the bottom."
Notably, however, Smith did not rule out selling facial recognition to the police in the future, although he appeared to draw a firm line at selling Microsoft's tech to anyone intending to use it with "autonomous weapons", ie, killer robots. ®