FTC Tech Summit highlights GPU shortages, concentrations of power
Startups hoping to compete in the industry face an uphill battle
Panel The dominance of big tech in cloud computing, coupled with a shortage of chips, is preventing smaller AI software and hardware startups from competing fairly, according to panelists at the FTC Tech Summit this week.
The conversation was set against the backdrop of the US Federal Trade Commission's announcement that it was launching an inquiry into the major players – Amazon, Microsoft, and Google – and their partnerships with top large language model developers Anthropic and OpenAI.
Amazon and Google have invested $6 billion in total into Anthropic, whilst Microsoft has so far pledged over $10 billion for an exclusive relationship with OpenAI. In return, the cloud giants get access to the latest generative AI models built by Anthropic and OpenAI, while the model makers gain computing resources.
These alliances allow all parties to compete with one another, but potentially exclude everyone else, the FTC says.
Under Chair Lina Khan, the commission is now scrutinizing these partnerships in more detail, and has requested information on the agreements, product release strategies, and their impact on the AI ecosystem. The trio control an estimated 66 percent of the cloud computing market, and have sway over who gets GPUs to train and run models.
Since these chips are scarce, they might be inclined to give them to their partners, which undermines competition amongst AI developers. Other startups trying to build large language models as capable as Anthropic's Claude or OpenAI's ChatGPT could be left struggling to secure the resources needed.
"We face basic questions of power and governance," Khan said in the opening remarks for the FTC's Tech Summit this week. "Will this be a moment of opening up markets to fair and free competition, unleashing the full potential of emerging technologies? Or will a handful of dominant firms concentrate control over these tools, locking us into a future of their choosing?," she asked.
The playing field is unequal due to the limited number of GPUs. But the problem goes deeper and boils down to hardware suppliers and manufacturers, according to experts speaking at the summit's panel discussion on AI, chips and cloud infrastructure.
"I think the biggest challenge that we're seeing is that all roads lead to Nvidia. They are a bottleneck on all of this, followed only slightly by the large cloud providers that are their primary customers," Corey Quinn, the Chief Cloud Economist at The Duckbill Group, who helps companies manage their AWS bills.
As the top provider of GPUs, Nvidia has benefited handsomely from the AI hype. As of this month, its market cap has reached $1.53 trillion and is expected to keep growing. Nvidia dominates negotiations with cloud providers, choosing how many chips to sell to each one and at what price. Meanwhile, rivals that have built their own AI accelerators haven't gained much traction in the cloud market.
Daven Rauchwerk, an entrepreneur who has founded hardware businesses, said the lack of choice hampers competition amongst semiconductor companies and affects the cloud industry too. Investors are less likely to risk losing money backing startups going up against Nvidia, Amazon, Microsoft, and Google, leading to less innovation.
"If you want to have more chip companies, you need more cloud companies. We have too few cloud companies. The large [ones] are not actually buying chips from [other] companies. If there's no market for these diverse sets of chips…well, why would a venture investor invest in a chip company?," he said.
The only vendors that can compete with Nvidia are the cloud providers themselves. Amazon, Google, and Microsoft have built custom AI accelerators for their own platforms, allowing them to control access to both hardware and AI models. Panelists were concerned this could give the trio more incentive to hike the prices of their AI services.
One way to counteract this would be to make it easier for customers to switch providers, said Tania Van den Brande, director of economics at the UK's communications regulator, Ofcom.
"I think what's important here is that not only [will that] enable challengers, but it also makes sure that the cloud providers keep being incentivized to go after each other's customer bases. That might be less the case if once a customer has moved in, they're not more or less locked in," she concluded. ®