HPC

US to fork $5bn+ into exascale supers

Get with it, Europe


With the supercomputing community in the Western economies freaking out just a little bit that China has come out of nowhere to take the lead in supercomputing, and the US supposedly getting ready to allocate $5bn in an effort to push up into the exascale realm, IDC could not have picked a better week to deliver its report to the European Commission about how it needs to spend lots of money to remain a player in the HPC arms race to exaflops-level sustained performance.

In February, as El Reg reported, the HPC analysts at IDC were tapped by the European Commission to work with some of the biggest supercomputing labs in Europe to rationalize and coordinate the move from petascale to exascale computing. IDC was commissioned to work with Teratec in France, Daresbury Laboratory in the United Kingdom, and Leibniz-Rechenzentrum and Forschungszentrum Jülich (FZJ) in Germany to create a "strategic agenda" for HPC projects in Europe.

The 94-page report states all of the obvious facts about how supercomputers are important to defense, meteorology, medicine, manufacturing, and other endeavors, and that the European collective cannot afford to fall behind the United States, Japan, or China when it comes to developing ever-faster supercomputers and, more importantly, the applications that make as much use of the hardware as possible to solve real problems, and not just do matrix math in Fortran for bragging rights every six months.
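That twice-a-year bragging-rights contest is the Top500 ranking, which scores machines on the Linpack benchmark: solving one huge dense system of linear equations as fast as possible. For the curious, here is a minimal Python sketch of the kind of measurement involved. It times a dense solve and converts the runtime into a flops rating using Linpack's nominal operation count of 2n³/3 + 2n². It is purely an illustration, not a stand-in for the real HPL benchmark code.

# Toy Linpack-style measurement: time a dense solve of Ax = b and
# report flop/s using the nominal HPL operation count (2n^3/3 + 2n^2).
# An illustrative sketch only, not the real HPL benchmark.
import time
import numpy as np

n = 2000                                  # matrix order; real Top500 runs use vastly larger n
rng = np.random.default_rng(42)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)                 # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * n**3 + 2.0 * n**2   # nominal operation count for the solve
print(f"n={n}: {elapsed:.3f} s, ~{flops / elapsed / 1e9:.1f} Gflop/s")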

The interesting tidbit buried deep in the report (on page 89) is that IDC believes that the United States Department of Energy, which shells out the big bucks for HPC in that country, is getting ready to ask Congress for in excess of $5bn (€3.75bn) "to develop multiple exascale computers." The Defense Advanced Research Projects Agency is also funding its Ubiquitous High Performance Computing (UHPC) program, which hopes to get a prototype exascale machine into the field by 2018.

The IDC report notes that Intel is working with five universities in Belgium to create the Flanders ExaScale Lab, and Intel is also working with French partners at the Exatec HPC Lab to focus on exascale hardware and software design. HP Labs is dabbling in exascale issues, and Cray is collaborating with a bunch of labs and universities with its own Exascale Research Initiative. IBM is working with FZJ in Germany to get an exascale machine up and running by 2020.

None of this, you'll note, sounds particularly European. Sure, as the study notes, there are plenty of HPC users in Europe, which still has a manufacturing and research base that is the envy of the world in many respects, but there are no big indigenous HPC suppliers.

There are some fine HPC companies in Europe — a shout out to Bright Computing's and Adaptive Computing's cluster management tools, to Allinea Software's HPC development tools, to T-Platforms' very slick CPU-GPU blade servers, and to Bull's hilariously named Bullx Xeon 7500 blades that are, despite the name, quite capable.

But there is not, as IDC correctly points out, an indigenous supplier or consortium of suppliers making a truly European exascale system. And that is embarrassing.

Nowhere in the report is there any suggestion that fixing this might be useful. So I will say again what I said in February: what the European Commission needs to do is fund innovation in HPC, and help foster homegrown products that make European companies money as other outfits spend money on HPC.

That might mean, for instance, creating a consortium that puts Ubuntu Linux on ARM processors goosed with GPUs or other accelerators (ClearSpeed, perhaps?) to strip away some of the overhead of the x64 and RISC architectures, which are general-purpose computers being asked to sustain very specific and unnatural rates of calculation. GlobalFoundries' wafer bakers in Dresden could certainly use the work stamping out such ARM chips.

Exascale might mean a lot of things, but it probably does not mean putting 1 billion Xeon processor cores in a system that eats 300 megawatts. There needs to be some true innovation here, and Europe has a real chance of building an HPC business for its own economies as well as building HPC systems for governments to play with.
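The back-of-the-envelope math shows why. Using loudly invented but plausible round numbers for a 2011-era x64 machine, say 1 gigaflops sustained per core on real codes and 3 gigaflops per watt for the whole system (both assumptions for illustration, not vendor specs), a sustained exaflops works out to a billion cores and a few hundred megawatts:

# Back-of-envelope exascale sizing using loudly assumed, illustrative
# numbers, not measurements of any real processor.
TARGET_FLOPS = 1e18          # one exaflops, sustained

FLOPS_PER_CORE = 1e9         # assume ~1 Gflop/s sustained per core on real codes
SYSTEM_GFLOPS_PER_WATT = 3.0 # assume ~3 Gflop/s per watt for a 2011-era x64 system

cores = TARGET_FLOPS / FLOPS_PER_CORE
megawatts = TARGET_FLOPS / (SYSTEM_GFLOPS_PER_WATT * 1e9) / 1e6

print(f"cores needed: {cores:,.0f}")         # 1,000,000,000 cores
print(f"power draw:   {megawatts:,.0f} MW")  # ~333 MW, a decent chunk of a power station

Tweak the assumptions however you like; straight extrapolation of general-purpose chips always lands in power-station territory, which is exactly the opening for accelerator-style innovation.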

That is what China is up to. And, for that matter, that is what the US is doing when it gives Cray and IBM so much money for petaflops supers these days.

If Europe wanted to have a lead in the HPC race, maybe it should have stayed in the server racket? The funny bit is that the exascale problem gives European entrepreneurs a chance to get back into the game and to help foster real innovation.

This isn't just about cutting checks to IBM, Intel, and Cray. Or rather, given the political and economic nature of supercomputing, it shouldn't be. Thank heavens for China, that's what I say. Wake up and innovate. ®
