Huawei claims it’s halved the time needed to build a 1,000-rack datacenter

Promises modular kit gets you up and running in six to nine months, with AI-powered ops to make it efficient


Huawei has entered the datacenter construction business with an offering that it claims can be built in half the time required by competing methods, then run more efficiently.

The prosaically named “Next-Generation Datacenter Facility”, as depicted in a video posted to Chinese social media, employs suspiciously-shipping-container-sized modules stacked into a larger building.

In the video, a pre-school girl and her father use Lego to assemble a cube-shaped building. The scene cuts to film of a very similar building under construction in the real world, before the director makes sure the metaphor can’t be missed by morphing the Lego and actual buildings, as depicted below.

Huawei Next-Generation Data Center Facility


But Huawei’s banking on speed and low cost, not cinematic subtlety. The Chinese giant asserts that its approach to big barn building means a thousand-rack facility can be up and running within six to nine months, compared to 18 months for bespoke jobs.

The company has also created a power supply it asserts is simpler than rivals’ offerings, and which can be delivered in two weeks instead of two months.

“Simplified cooling maximizes heat exchange efficiency by changing multiple heat exchanges to one heat exchange, and shortening the cooling link,” Huawei’s enthused.

The Next-Generation Datacenter Facility also includes Huawei’s very own automation and AI-infused optimization tools, said to be capable of fine-tuning cooling to make it more efficient. Power is a major cost for datacenter operations, so if Huawei has done this well it will be appreciated.

The Chinese mega-corp has not said if the datacenters require the presence of Huawei products to realize the promised benefits.

China has introduced a plan to relocate five million datacenter racks from the crowded east of the country to ten designated datacenter precincts in its sparsely populated west. That’s one obvious market where rapid datacenter builds will be in demand. If Huawei wins even 20 percent of that business – a million racks – it’ll be building 1,000 of these datacenters!

Quick builds will also be appreciated elsewhere around the world. Whether Huawei’s design is welcomed by governments remains to be seen. The company’s well-documented troubles have centered on its role as a network equipment provider, and the risk that information about network layouts, or actual data, could reach Beijing. The contents of a datacenter also represent very useful information to a nation credibly accused of using state-backed groups to conduct offensive cyber-ops.

Huawei denies it would ever act against clients’ interests, violate local laws, or willingly serve as an agent of China’s intelligence services.

Modular datacenters based on shipping containers are not new, and one facility that used such a design – OVH’s Strasbourg facility – infamously went up in flames.

Huawei’s stated that this design is resilient and environmentally responsible, in addition to offering rapid builds. ®

