Google's ex-boss tells the US it's time to take the gloves off on autonomous weapons

Plus: AI Index 2021 report takeaways, Chocolate Factory banished from top ethics conference, and more

In brief The US government should avoid hastily banning AI-powered autonomous weapons and instead step up its efforts to develop such systems to keep pace with foreign adversaries, according to the National Security Commission on AI.

The independent group, headed by ex-Google CEO Eric Schmidt and funded by the Department of Defense, has published its final report advising the White House on how best to advance AI and machine learning to keep the US ahead of its competitors.

Stretching over 750 pages, the report covers a lot of areas, including retaining talent, the future of warfare, protecting IP, and US semiconductor supply chains.

The most controversial point raised by Schmidt and the other advisors was that America should not turn its back on autonomous AI weapons. The US government should actually build its own systems to deter other countries from wreaking havoc, the commission argued, though development should be carefully monitored to make sure it abides by ethical policies.

The report also urged the White House to encourage international cooperation with countries like Russia and China to develop strict rules on how to build and test autonomous AI weapons. You can read the full report here [PDF].

Read about the latest trends in the machine learning community in the AI Index

Here's another report to sink your teeth into. Scholars teamed up with Stanford University's Institute for Human-Centered AI to chart the technology's impact across academia, industry, and the economy.

The combined effort culminates in an annual AI Index report, now in its fourth year. Among the most interesting takeaways:

  • China has, for the first time, overtaken the US in how often its academic papers are cited by others, and it has been publishing more papers than the US since 2017.
  • Fake content generated by machine-learning algorithms is getting ever more convincing. That might be good news for computer graphics, but it's bad for us humans, who are having a harder time telling what's real from what's fake.
  • Natural language processing is experiencing a boom. Researchers have pushed the field forward with architectures like the transformer, which underpins some of today's most capable text-generation models.
  • The percentage of women getting PhDs (18 per cent) or tenure (16 per cent) in an AI-related area has stayed the same over the last decade.
  • Investment in using machine learning for discovering and designing new drugs has shot up during the current COVID-19 pandemic.

You can read the whole 2021 report here.

Google has been dropped as a sponsor from the leading AI ethics conference

The organizers on the executive committee managing this year's Conference on Fairness, Accountability, and Transparency (FAccT) have decided to remove Google as a sponsor.

The academic community is still reeling from Google's controversial decision to fire its two ethical AI co-leads, Timnit Gebru and Margaret Mitchell, and reshuffle the whole department. The drama centers on Google's strict control over what its researchers can and can't publish. One complaint was that company lawyers tended to tone down negative language or remove references to the shortcomings of Google's own technology.

Michael Ekstrand, an assistant professor at Boise State University and co-chair of FAccT, told The Register that Google's sponsorship has been temporarily paused this year.

"We concluded it was in the best interests of the community to pause the sponsorship relationship with Google while we revisit our sponsorship policy for next year," he said. The conference is currently taking place virtually until 10 March.

Help convert skeletal chemical squiggles into machine-readable chemical formulas

Chemists have multiple ways of notating compounds. Sometimes a simple formula like CO2 is fine, sometimes they draw the structure to show how the individual atoms are bonded, and sometimes it's just a series of squiggles, as in a skeletal formula.

Skeletal formulas are most common in organic chemistry, where the vertices and ends of the lines stand for carbon atoms and the attached hydrogens are left implicit. The notation may look elegant, but machines can't really read or understand the sketches. That's a problem for researchers who trawl academic literature with AI algorithms to pull out the data they need for tasks like developing new drugs: information represented this way is simply missed.

That's why Bristol-Myers Squibb, a pharmaceutical company headquartered in New York City, has teamed up with Kaggle to fund a new competition that challenges data scientists to develop software that converts skeletal formulas into readable text.

Specifically, it wants the formulas written as International Chemical Identifier (InChI) strings. InChI is a standard way of naming chemical compounds, based on a set of agreed-upon principles, that makes them easy to search for in databases.
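For a sense of what that target output looks like, here's a minimal sketch in Python using the open-source RDKit toolkit (our choice of library and example molecule, not anything mandated by the competition). The contest itself starts from images of skeletal formulas; this merely shows the kind of InChI string entrants must produce, generated here from a SMILES text encoding instead:

  # Minimal sketch: generate a standard InChI string with RDKit.
  # Assumes RDKit is installed (e.g. pip install rdkit). The actual
  # competition input is an image; we start from SMILES purely to
  # illustrate the target output format.
  from rdkit import Chem

  # Aspirin, written in SMILES, another common text notation
  mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")

  # Convert the molecule to its standard International Chemical Identifier
  print(Chem.MolToInchi(mol))
  # InChI=1S/C9H8O4/c1-6(10)13-8-5-3-2-4-7(8)9(11)12/h2-5H,1H3,(H,11,12)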

The winner will receive a cash prize of $25,000 (£18,080), second place will get $15,000 (£10,850), and third place should scoop $10,000 (£7,230). ®
