Google imagines out-of-this-world AI – running on orbital datacenters

Chocolate Factory's latest moonshot aims to put an AI supercomputing cluster in sun-synchronous orbit

Google on Tuesday announced a new moonshot – launching constellations of solar-powered satellites packed to the gills with its home-grown tensor processing units (TPUs) to form orbital AI datacenters.

"In the future, space may be the best place to scale AI compute," Google executives wrote in a blog post, in which they explain that solar panels can be eight times more efficient in space than on Earth, and can produce power continuously.

Availability of energy has become a limiting factor for terrestrial datacenter builds, so the prospect of an abundant source of clean, uninterrupted power is no doubt quite attractive.

However, as Google points out, realizing this plan requires it to overcome significant challenges, one of which is finding enough rockets to place a useful amount of infrastructure in orbit. Despite SpaceX being on track to conduct over 140 launches this year, launch capacity is not easy to come by.

Launch cost is another consideration. Assuming launch prices fall to $200 per kilogram by the mid-2030s, Google contends that space-based datacenters will be roughly comparable to their terrestrial equivalents in terms of energy costs. Current launch prices are more than ten times that target.
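To give a rough feel for that comparison – this is not Google's own arithmetic, and the mass-per-kilowatt figure, satellite lifetime, and terrestrial electricity price below are hypothetical placeholders – the idea is to amortize launch cost over a satellite's service life and set it against what a terrestrial datacenter pays for power each year:

```python
# Rough, illustrative comparison only -- all figures are assumptions,
# not numbers from Google's Project Suncatcher paper.
launch_price_per_kg = 200.0      # $/kg, the mid-2030s target cited above
mass_per_kw = 10.0               # kg of satellite per kW of compute power (assumed)
satellite_lifetime_years = 5.0   # assumed service life in orbit

launch_cost_per_kw_year = launch_price_per_kg * mass_per_kw / satellite_lifetime_years

terrestrial_price_per_kwh = 0.08          # $/kWh, assumed industrial rate
hours_per_year = 24 * 365
terrestrial_cost_per_kw_year = terrestrial_price_per_kwh * hours_per_year

print(f"Amortized launch cost: ~${launch_cost_per_kw_year:,.0f} per kW-year")
print(f"Terrestrial energy cost: ~${terrestrial_cost_per_kw_year:,.0f} per kW-year")
# At ten times the launch price, the orbital figure grows tenfold,
# which is why the $200/kg target matters.
```

Under those assumed numbers the two come out in the same ballpark – a few hundred dollars per kilowatt-year – which is the sense in which Google calls the costs "roughly comparable."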

AI infrastructure also needs fast and resilient networking, and vendors today provide that with optical fiber and/or copper cabling. For Project Suncatcher, Google will instead beam data wirelessly through the near vacuum of Earth orbit using free-space optical links.

By deploying spatial multiplexing, a technique for increasing throughput using multiple independent data streams, Google expects it will be able to connect large numbers of satellites together at "tens of terabits a second."
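As a back-of-the-envelope illustration – the 10 Tbps target and the resulting stream count are assumptions for the sake of the example, not Google's design – aggregate capacity here is simply the per-stream rate multiplied by the number of independent spatial streams:

```python
# Back-of-the-envelope only: how many independent spatial streams of a
# given per-stream rate reach a target aggregate bandwidth.
import math

per_stream_gbps = 800     # the per-link rate Google has demonstrated on the bench
target_tbps = 10          # "tens of terabits" -- take 10 Tbps as an example

streams_needed = math.ceil(target_tbps * 1000 / per_stream_gbps)
aggregate_tbps = streams_needed * per_stream_gbps / 1000

print(f"{streams_needed} parallel streams -> {aggregate_tbps:.1f} Tbps aggregate")
# 13 streams -> 10.4 Tbps
```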

Google says it's already shown that the technique can be effective using 800 Gbps optics, but admits those systems required a lot of energy.

"Achieving this kind of bandwidth requires received power levels thousands of times higher than typical in conventional, long-range deployments," Google explained. "Since received power scales inversely with the square of the distance, we can overcome this challenge by flying the satellites in a very close formation – kilometers or less."

In other words, these compute constellations will need to be quite dense. In one simulation, Google models a cluster of 81 satellites flying 100 to 200 meters from one another, in an arrangement two kilometers across, at an altitude of 650 kilometers.
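To see how those numbers hang together, here is a minimal sketch – the flat square-grid layout and exact spacing are assumptions for illustration, not Google's actual orbital geometry: 81 satellites arranged nine by nine with neighbors a couple of hundred meters apart span roughly two kilometers corner to corner.

```python
# Illustrative only: treat the 81-satellite cluster as a flat 9 x 9 grid.
# Google's simulation uses real orbital dynamics; this just checks the scale.
import math

n_per_side = 9       # 9 x 9 = 81 satellites
spacing_m = 180      # assumed nearest-neighbor distance, within the 100-200 m range

side_m = (n_per_side - 1) * spacing_m    # edge length of the grid
diagonal_m = side_m * math.sqrt(2)       # corner-to-corner extent

print(f"Grid edge: {side_m/1000:.2f} km, diagonal: {diagonal_m/1000:.2f} km")
# Grid edge: 1.44 km, diagonal: 2.04 km -- about two kilometers across
```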

If Google manages to get a sufficiently large fleet of TPUs into orbit and can make the space-based communications network reliable, the equipment still needs to survive the unforgiving environment of space.

The source of energy that makes the idea of a space-based AI supercomputer so attractive also happens to spew out ionizing radiation, which isn't exactly great for electronics. On Earth, we're shielded from much of this radiation by the planet's magnetic field and thick atmosphere. In orbit, those protections are far weaker.

Google is already investigating radiation-hardened versions of its TPUs. And, as it turns out, the company may not need to do much to keep them alive. In testing, Google exposed its TPU v6e (codenamed Trillium) accelerators to a 67 megaelectron-volt proton beam to see how they would cope with radiation.

The results showed that the most sensitive part of the accelerator was its high-bandwidth memory, which began showing irregularities after a cumulative dose of 2 krad(Si) – almost three times the dose the chip would be expected to endure in a shielded environment over a five-year mission.
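Working backwards from that ratio – a rough calculation, not a figure from Google's paper – if 2 krad(Si) is nearly three times the expected shielded mission dose, the TPUs would be expected to absorb somewhere around 0.7 krad(Si) over five years, leaving a comfortable margin before the memory starts misbehaving.

```python
# Rough back-calculation from the figures above; the "almost 3x" ratio is
# taken at face value, so the result is approximate.
hbm_anomaly_threshold_krad = 2.0   # dose at which HBM irregularities appeared
margin_factor = 3.0                # "almost 3 times" the expected mission dose

expected_mission_dose_krad = hbm_anomaly_threshold_krad / margin_factor
per_year_krad = expected_mission_dose_krad / 5   # five-year shielded mission

print(f"Expected mission dose: ~{expected_mission_dose_krad:.2f} krad(Si) "
      f"(~{per_year_krad*1000:.0f} rad(Si) per year)")
# ~0.67 krad(Si) over five years, roughly 130 rad(Si) per year
```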

Google conducted system tests on Earth and documented those efforts in this prepublication paper [PDF]. The company plans to launch a pair of prototype satellites in 2027 to further evaluate its hardware and the feasibility of orbiting datacenters.

Google isn't the first to suggest space as the next datacenter frontier.

We reported on a startup that hoped to do it back in 2017, but that effort never made it into orbit.

Hewlett Packard Enterprise (HPE) has been working on spacefaring compute platforms for years. Its first unit, Spaceborne, launched in 2017 and spent nearly two years aboard the ISS, but suffered failures in one of its four redundant PSUs and nine of its 20 SSDs. Axiom Space also launched a similar compute prototype to the ISS in August.

Just last month, Amazon founder and executive chair Jeff Bezos predicted that, within the next two decades, gigawatt-scale datacenters will fill the skies, powered by a limitless stream of photons from the sun. And last Saturday, Elon Musk said SpaceX will build orbiting datacenters. ®
