President Biden urged to appoint AI officers to regulate this shiny-shiny tech

And a pin to pop this hype bubble would be nice, too

In its upcoming first report, the US National Artificial Intelligence Advisory Committee has urged President Joe Biden to fill key positions and create new organizations to address rising societal concerns over AI models. 

The US government has been criticized for being slow to regulate the technology, lagging behind the European Union and China in drafting policies and laws to tackle safety risks. 

But recent actions – such as the US Department of Commerce issuing a formal request for public comments on how to audit ML algorithms; the Federal Trade Commission threatening to punish companies that use AI to dupe citizens or allow biases in neural networks to trample their civil rights; and talk of bipartisan legislation from leading senators – suggest the tide is turning.

While leaders figure out what new rules need to be in place to tackle growing concerns over bias, privacy, discrimination, labor displacement, and more, the government needs to restructure its own workforce to better address these challenges. 

Specifically, in its draft report [PDF], the NAIAC recommended the president immediately appoint a director of the National AI Initiative Office – a body designed to coordinate all AI-related matters between agencies – as well as a chief technology officer in the White House. 

The role of chief responsible AI officer should also be created, to find a leader capable of implementing and advancing strategies to develop trustworthy AI, the report argued. Biden was also advised to launch the Emerging Technology Council – a group made up of senior White House members – to drive technology policy focusing on civil rights and equity, the economy and national security. 

Finally, a multi-agency task force is needed to help small and medium-sized organizations, which may have fewer resources than larger enterprises, design and deploy AI safely. 

On Wednesday, during a live discussion held by the Brookings Institution, Miriam Vogel, a member serving on the NAIAC and president and CEO of non-profit EqualAI, said that filling those positions would propel efforts to regulate AI. 

"Supporting parts of government that are in charge of that enforcement, making sure that they're sufficiently resourced, that the leadership positions within this area are filled and appropriately resourced … I do think that's a first step in that direction," she said. Vogel also noted that the US already has laws on the books that can tackle some of these issues – such as civil rights violations arising from algorithms that perpetuate bias and discrimination in areas like employment or finance.

"We've started to see litigation," she added. "We've seen several regulatory bodies talk about the fact that there's going to be more regulation, more litigation in the space." She pointed to joint statements issued by the "EEOC, and the DOJ, the Department of Labor, the Consumer Financial Protection Bureau and so on – all the alphabet soup of the US" threatening to crack down on companies using biased software to make decisions. 

New legislation, however, needs to be introduced to mitigate potential dangers as ever more powerful AI models are built and deployed across areas like education and healthcare. The draft of the NAIAC report was released this week, but is still being finalized.

"I think enforcement is going to play a part at some point," Reggie Townsend, a member of the NAIAC and vice president of the Data Ethics Practice at analytics biz SAS, said at the Brookings Institution event. "But first, you got to start with rules."

"There are a lot of folks around the world literally who are trying to figure this stuff out for the first time. So we do have to extend a little bit of grace as we try to figure some of this stuff out. So that we don't put structures in place that have unintended consequences that are every bit as harmful as those that we're attempting to avoid," he concluded. ®
