HPC

TACC Frontera's 2022: Academic supercomputer to run intriguing experiments

Plus: Director reveals 10 million node hours, 50-70 million core hours went into COVID-19 research


The largest academic supercomputer in the world has a busy year ahead of it, with researchers from 45 institutions across 22 states being awarded time for its coming operational run.

The Texas Advanced Computing Center (TACC) at the University of Texas at Austin, which operates Frontera, said it has allocated time for 58 experiments through its Large Resource Allocation Committee (LRAC), which handles the largest proposals. To qualify for an LRAC grant, proposals must justify effective use of a minimum of 250,000 node hours and show that the research couldn't be done otherwise. 

Two additional grant types are available for smaller projects, but LRAC projects will consume the majority of Frontera's capacity: an estimated 83 percent of the machine's 2022-23 workload. 

Frontera was built in 2019 and premiered at number five on the Top500 list of supercomputers. Since then, the CPU-based, 8,008-node machine has slipped to 13th, but TACC still maintains that it's "the most powerful supercomputer in academia."

In addition to its 38.7-petaflop main system, Frontera also includes the Longhorn GPU-accelerated workload system, a Frontera GPU subsystem consisting of 90 immersion-cooled Nvidia GPU nodes, and the Ranch tape storage system. 

Getting TACC back on track

2020 was a bad year for the world, and a particularly busy one for TACC and Frontera, which was immediately diverted toward COVID-19 research that TACC director Dan Stanzione described as a "huge effort for us."

"Ten million node hours, and approximately 50-70 million core hours went into COVID-19," Stanzione said. TACC and Frontera worked on molecular dynamics, drug discovery, epidemiology, and genomics work related to the coronavirus. 

LRAC experiments are some of the biggest, most complicated tasks Frontera is capable of running, and the list of experiments scheduled for the coming year reflects that. 

As one example, researchers from Carnegie Mellon University and the University of California, Riverside plan to use Frontera to develop super-resolution simulations of galaxies and quasars, which the team said are necessary to keep up with advancements in telescope technology. Specifically, the team wants to overcome simulation limitations that force researchers to maximize either resolution or volume, but not both. 

Another experiment hopes to gather data on simulated Mach 6 flows against a surface inclined at 35 degrees, with the aim of improving our understanding of vehicles moving at hypersonic velocities.

Supercomputing advances are being tested as well, with researchers from Binghamton University in New York planning to use Frontera to screen a large batch of candidate chemicals for use as superconductors, much as drug-discovery pipelines screen compound libraries.

Many of the remaining experiments [PDF] deal with astrophysics modeling, while others are looking into supercell thunderstorms, elevation modeling, and quantum computing. ®
