US cities and towns purchase AI surveillance kit linked to China's Uyghur abuse

Plus: Ex-OpenAI employees launch machine-learning research org

In brief Nearly a hundred counties, towns and cities across the US have purchased surveillance cameras from Chinese companies blocked by the federal government over the human rights abuse of Uyghur Muslims in Xinjiang.

Companies like Hikvision and Dahua were placed on the Department of Commerce’s Entity List back in 2019. US firms are prohibited from exporting goods or services to these entities without governmental permission, but state and municipal authorities are still allowed to purchase their products, although it’s not recommended.

For example, the Board of Education in Fayette County, Georgia, has splashed out $490,000 on Hikvision thermal cameras for its public schools since the start of the COVID-19 pandemic, according to TechCrunch.

IPVM, a company focused on researching and testing video surveillance equipment, has compiled a map showing where these cameras are deployed. You can see that here.

New $100m fund for AI startups managed by OpenAI

OpenAI and Microsoft have launched a $100m fund to invest in novel AI startups and will support them by providing discounted cloud compute on Azure and access to new machine learning tools.

The fund will be managed by OpenAI, and the money will be provided by Microsoft and other partners, who have previously invested in the San Francisco-based biz. “We’re looking to partner with a small number of early-stage startups in fields where artificial intelligence can have a transformative effect—like health care, climate change, and education—and where AI tools can empower people by helping them be more productive,” OpenAI said this week.

Startups can start applying now, though competition will be fierce: OpenAI CEO Sam Altman said the fund probably won’t invest in more than ten companies or so. The financial details of any future deals weren’t disclosed.

You can submit an application here.

Ex-OpenAI employees left to start a new AI research org

Hey, remember before OpenAI teamed up with Microsoft and became all corporate? Well, a group of workers who left the company last year sure do.

After a reorg was announced, Dario Amodei, VP of research at OpenAI, exited the organization, along with a few others, including policy director and ex-Register vulture Jack Clark.

Now, those folks have launched a public-benefit corporation also focused on AI research called Anthropic. “Anthropic’s goal is to make the fundamental research advances that will let us build more capable, general, and reliable AI systems, then deploy these systems in a way that benefits people,” Amodei, now its CEO, said in a statement.

Anthropic raised $124m in series-A funding. The round was led by Skype co-founder Jaan Tallinn, with participation from James McClave; Asana CEO Dustin Moskovitz; the Center for Emerging Risk Research; ex-Google CEO Eric Schmidt; and others. The company filed for registration in February this year and was officially recognized in May [PDF].

Google signs deal with US hospital chain to develop AI algorithms for healthcare

Private health corporation HCA Healthcare has entered a multi-year partnership with Google to develop new medical algorithms and run its software on the Silicon Valley giant’s cloud platform.

“The partnership will utilize Google Cloud’s healthcare data offerings, including the Google Cloud Healthcare API and BigQuery, a planetary-scale database with full support for HL7v2 and FHIRv4 data standards, as well as HIPAA compliance,” Google announced this week.

“Google Cloud’s data, analytics, and AI offerings will power custom solutions for clinical and operational settings, built in partnership with Google Cloud’s Office of the CTO and Google Cloud Professional Services.”

Patient data is sensitive, and HCA said it will make sure details are kept private and secure on the company’s servers. But not everyone’s convinced. Arthur Caplan, a medical ethics expert, told CNBC that privacy laws in the US have to be strengthened to ensure that the data is not misused.

“Maybe they don’t have your name, but they sure enough can figure out what sub-group, sub-population might do best by getting advertised to you,” he said. ®

Editor's note: This article was revised after publication to clarify that Anthropic is a public-benefit corp.
