Despite Wall Street jitters, AI hopefuls keep spending billions on AI infrastructure

Sunk cost fallacy? No, I just need a little more cash for this AGI thing I’ve been working on

Comment Despite persistent worries that vast spending on AI infrastructure may not pay for itself, cloud providers, hyperscalers, and datacenter operators have continued to shovel billions of dollars into ever-larger GPU clusters.

Those who worry the world is spending too much, too fast on AI usually argue there is thin evidence of LLM-era machine-learning investments producing substantial revenue or profit, and point to cautious corporate adoption. They also highlight DeepSeek's claim that, thanks to its efficient design, it could have trained its V3 model in the cloud for a few million dollars, news that sent the value of AI-centric stocks tumbling. In reality, the Chinese lab spent a pretty penny on its own on-prem GPU cluster to build the model, though it still appears to be a more lightweight operation than its Western rivals while being roughly as capable.

That all said, plenty of investors remain optimistic.

OpenAI's $500 billion Stargate tie-up with SoftBank, Oracle, MGX, and others was a massive vote of confidence in demand for AI infrastructure.

In the weeks following the mega project's announcement, we've seen a flurry of fresh investment in AI chip startups, datacenters that house GPUs, and model-making companies.

On Monday, Chinese e-commerce and cloud titan Alibaba announced plans to invest 380 billion yuan (about $53 billion) in cloud and AI infrastructure over the next three years to fuel the development of artificial general intelligence (AGI).

Alongside its investment plans, Alibaba rolled out a bigger, more capable "thinking" model to challenge DeepSeek's R1, OpenAI's o3-mini, and Anthropic's new Claude 3.7 Sonnet, which also launched Monday.

Speaking of Anthropic, the Wall Street Journal reports the startup is finalizing a $3.5 billion funding round that would value it at $61.5 billion.

Monday's announcements follow a flurry of investment in smaller GPU neo-cloud providers, which specialize in building AI training grounds containing tens of thousands of accelerators, available to rent at cut-rate prices so long as you commit to a long-term contract.

Last week, GPU cloud provider Lambda won a $480 million series-D funding round. The money will be used to pack its data halls with Nvidia's latest GPUs. With that round, the cloud player has now raised $1.4 billion in total.

Just a day later, AI cloud services startup Together AI walked off with $305 million from the likes of General Catalyst and Prosperity7.

In a release, Together AI claimed it'd secured 200 megawatts of datacenter capacity, which it intends to fill with yet more of Nvidia's flashy new Blackwell GPUs. Earlier this month, the startup announced the availability of Nvidia's B200-based systems and is working to deploy a cluster of 36,000 GB200 GPUs in partnership with Hypertec.

Apple is also pushing ahead with its plans to pack its datacenters with AI servers powered by its custom silicon as part of a four-year $500 billion commitment to bolster US manufacturing and R&D. Cupertino’s AI machines are designed to support the iGiant's AI-infused software experiences by offloading workloads deemed too intense to run locally on iDevices.

The news isn’t all good. As we recently reported, Microsoft has reportedly walked away from several high-capacity datacenter leases, sowing fear among investors that one of the AI boom's biggest cheerleaders may have overestimated demand. That's despite Microsoft CEO Satya Nadella's comments a few weeks back that he's good for his $80 billion contribution to the Stargate project.

DeepSeek also continues to worry many, but American tech players, including Google DeepMind's Demis Hassabis, have since called several of the Chinese company's claims into question.

Microsoft’s Nadella and Meta chief Mark Zuckerberg have continued to insist that additional compute infrastructure is essential, both to power the inferencing workloads that put models to work and to push for artificial general intelligence.

This week will bring another major data point when Nvidia announces its quarterly earnings, giving investors a chance to assuage their fears or justify their unease. ®
