Stargate, smargate. We're spending $60B+ on AI this year, Meta's Zuckerberg boasts
Can't keep the drama Llama out of this race
Meta CEO Mark Zuckerberg revealed plans on Friday to blow through as much as $60 billion to $65 billion in 2025 on plenty more AI resources for his social media mega-corp – and signaled his intention to continue the spending spree for years to come.
The announcement comes just days after rival model builder OpenAI unveiled, in a briefing with US President Trump, an alleged $500 billion AI infrastructure project for itself called Stargate alongside partners SoftBank, Oracle, and MGX.
The Stargate project quickly drew skepticism from the likes of xAI founder and US Department of Government Efficiency (DOGE) head Elon Musk, who openly questioned whether OpenAI CEO Sam Altman and his pals really had the cash to make it happen. The OpenAI chief rubbished those claims. Given the history and animosity between Musk and Altman, none of this is a surprise, though seeing Musk (a big-name player in the Trump administration) publicly trashing Stargate (a project encouraged by Trump) like a jealous teenager has given us a great view of the nation's stable geniuses.
Then there's Microsoft, Google, and Amazon each separately pledging to spend tens of billions of dollars – at least $200 billion between them – on building out AI infrastructure over the next year or so to inject assistants, generative models, and more into their products.
Clearly, Zuck wants in on this drama. In a Facebook post early Friday, he outlined his vision to make Meta's AI "the leading assistant serving more than a billion people," and claimed that his boffins' Llama 4 model would become the "leading state of the art model."
Is Mark rattled by the readily available and highly capable DeepSeek model out of China that's getting machine-learning types excited, or the rise of super labs like Anthropic and OpenAI, or Google's persistence paying off with Gemini? Maybe.
Like its competitors, Meta has no problem burning cash to get ahead in this artificial intelligence race. According to Zuckerberg, the Social Network's $60-65 billion in capital expenditure will support the deployment of roughly a gigawatt of new compute capacity, with more than 1.3 million GPUs training and serving models by the end of the year.
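Those two headline figures roughly hang together, for what it's worth. A quick back-of-envelope sketch (our arithmetic, not Meta's) divides the stated capacity by the stated GPU count:

```python
# Back-of-envelope: does "roughly a gigawatt" square with "1.3 million GPUs"?
# Figures are Zuckerberg's; the per-GPU estimate is our own rough division.
total_power_watts = 1_000_000_000   # ~1 GW of new compute capacity
gpu_count = 1_300_000               # year-end GPU fleet figure

watts_per_gpu = total_power_watts / gpu_count
print(f"~{watts_per_gpu:.0f} W per GPU")  # ~769 W
```

Around 769 watts per accelerator is in the right ballpark for current datacenter GPUs before you account for cooling, networking, and host overhead, so neither number looks wildly out of line with the other.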
While Zuckerberg didn't go into detail as to whose GPUs he'll be decking his data halls with, Meta told El Reg it'll be a "mix of commercially available GPUs and our in-house silicon," aka MTIA.
While many AI devs, like xAI, have favored Nvidia hardware, Meta hasn't been shy about shopping around or building its own. Meta has previously deployed large quantities of Nvidia's Hopper and AMD's Instinct MI300-series GPUs to train and serve its models. It's also developed its own silicon to power its recommender models, but has yet to announce custom parts aimed at LLMs or diffusion models.
Along with training and running new models like Llama 4, some of these GPUs will eventually power an AI engineer, which Zuckerberg insists "will start contributing increasing amounts of code to our R&D efforts."
All the GPUs in the world won't do Meta any good if it doesn't have a place to put them. To power the accelerators, Meta began construction last month on a two-gigawatt-plus datacenter in Richland Parish, Louisiana. The facility is so large, Zuckerberg boasted, that "it would cover a significant part of Manhattan," and he shared an image of the facility overlaid on the island.
The $10 billion facility, announced early last month, will cover four million square feet, and it won't be completed this year. The facility will instead be built in phases with construction continuing through 2030.
While Meta is looking for a nuclear-power provider to keep the GPUs running, this site will instead be powered by combined-cycle combustion turbine plants with a total energy generation capacity of 2,262 megawatts.
Whether investors are happy about it or not, it seems Zuckerberg is prepared to keep his AI shopping spree going for a while longer. "We have the capital to continue investing in the years ahead," he wrote. "This is a massive effort, and over the coming years it will drive our core products and business, unlock historic innovation, and extend American technology leadership." ®