Interview: AARNet's Peter Elford on Australia's national research infrastructure

What bits of the boffins' cloud do we build ourselves?

Australia is re-crafting the roadmap that guides its national research infrastructure, a task that covers everything from the network to the nation's high performance computing systems.

The roadmap from the Office of the Chief Scientist was offered for public comment late last year. Shaped by chief scientist Alan Finkel, it aims to work out where the national research infrastructure is lacking or in danger of falling behind.

The Department of Education next gets to digest submissions responding to the roadmap's recommendations (PDF) (summarised below), which aim to plug the gaps the community identified in the country's research infrastructure.

The AARNet (Australia's Academic and Research Network) submission came to our attention early this month, when the national research network published it (PDF).

Its main recommendations are better definition of the Australian Research Data Cloud; expansion of the AARNet network to better support researchers; and clearer definition of the priority research areas to ensure their outcomes.

National infrastructure recommendations

1. The roadmap says the digital and e-research platforms need to serve focus areas identified as platforms for humanities, arts and social sciences; characterisation; advanced fabrication and manufacturing; astronomy and advanced physics; environmental systems; biosecurity; complex biology; and therapeutic development.

2. There needs to be a Research Infrastructure National Advisory Group.

3. A research investment plan is needed.

4. Complementary initiatives such as the Medical Research Future Fund and the Biomedical Translation Fund need to be served by the national research infrastructure.

5. Skills development is, obviously, a requirement.

6. Existing national facilities (for example the RV Investigator research vessel) shouldn't lose funds to pay for the national research infrastructure (the roadmap is less blunt, asking that they get “ongoing investment”).

7. International engagement should be coordinated – for example, regarding access to international facilities.

8. Awareness-raising is needed to promote researcher engagement with the facilities available to them.

9. High-performance computing (HPC) needs money and a review of governance arrangements.

To get a better understanding of its position, Vulture South spoke to Peter Elford, AARNet's Director, Government Relations and eResearch.

The piece of the roadmap that AARNet finds most interesting, Elford said, is the future of the Australian Research Data Cloud, envisaged in the roadmap as something to consolidate past work in e-research clouds into a single infrastructure.

"What's in it? What's not? What services does it provide? How do we make it sustainable?" are questions that aren't easy to answer, he said.

Vulture South asked whether this was an appropriate project to approach from the point of view of starting with a "minimum viable product."

The problem with that approach, he said, is adhering to it: "You start saying you're only going to build the minimum viable platform, and pretty soon you're trying to solve world hunger."

He added that it's important to consider an Australian Research Data Cloud as a separate question from the country's high performance computing facilities, important as those are.

"The number of users served [by HPC] is very small – there are 50,000 researchers in Australia, and a couple of hundred of those are users of HPC. And HPC is a well-understood problem: compute and storage, all locally connected, wrapped in the skills of requirements specialists."

Other researchers, he said, still need to collect, share, re-use and collaborate on their data; what they need for that is scalability on a cloud of commodity computers.

"The long-tail of research has large data sets. They need access to commodity compute, and access to tools and resources ... even in disciplines that aren't seen as data intensive."

The work already done by the Australian National Data Service (ANDS), the shared-storage Research Data Services (RDS), and the virtual infrastructure and virtual laboratories of National eResearch Collaboration Tools and Resources (NeCTAR) provides a good starting point, he said, but the Research Data Cloud "needs to be more than just those – it needs to be us, the Australian Access Federation, and major institutions," working together.

Moreover, he said, there needs to be more community work, so researchers are satisfied with the cloud they're offered.

Who builds what?

Calling for community involvement opens up what's traditionally a thorny topic in the research sector: what do you build versus what do you buy?

It's a "very tricky" conversation that challenges cultures, expectations, and historical funding arrangements skewed toward "you don't buy a service, you build infrastructure."

The research sector now recognises the importance of weaving commercial services into what it builds, "a cohesive, integrated combination of services ... the question is how do we get there?"

Elford noted that AARNet's own experience of integrating commercial services into what it builds could serve as a model to help the Australian Research Data Cloud fly.

"There are some things that, like Excel, just come pre-built. Others you have to build – for a genome sequencer, there's a layer of expertise you need."


As for HPC facilities, Elford said, everybody agrees: the country's supercomputing infrastructure needs substantial capital investment to bring it up to date.

Of the two peak national facilities – the Pawsey Supercomputing Centre in Perth and the National Computational Infrastructure (NCI) at the Australian National University – the NCI's hardware is ageing, short of capacity, and needs re-investment.

"Its model requires substantial federal investment every four to five years," he noted.

Governance is also under heavy discussion in the research community, with the possibility that the two facilities could come under some kind of shared governance model or even a single board.

Beneath lies the network

A surprise from AARNet is that in spite of being the body that first brought the Internet to Australia, and operating a continent-spanning and globally connected fibre network, it still sees gaps in university connectivity in Australia.

"Not all research facilities are well-connected: there are some institutes not connected to AARNet, and some campuses that don't have connections that support higher education needs," Elford said.

Primarily, he explained, connectivity challenges arise because institutions inevitably expand, opening new campuses outside the existing AARNet footprint that then need network extensions.

"Someone needs to make the investments to expand the networks to new locations: either the institutions themselves, or through Commonwealth investment."

Whether it's new university campuses at Taree (both the University of New England and the University of Newcastle had their eye on the town, so they co-funded connectivity), or the very remote Square Kilometre Array (320km from the Western Australian city of Geraldton), the sector needs to work out how to fund new locations.

"International connectivity is one of the other gaps we identified," Elford said.

While AARNet has 100Gbps of capacity to the United States on the Southern Cross Cable, "capacity to Asia is much less than it needs to be, so we're looking at investment there."

Instead of buying 10Gbps services to Asia, the research sector would prefer to take whole wavelengths, but "not everybody is building subsea cables" that align with AARNet's needs. Its targets include Japan, China, the Philippines, and Singapore. ®


Biting the hand that feeds IT © 1998–2022