Academics tell Brit MPs to check the software used when considering reproducibility in science and tech research

It's seldom subject to the same rigour as conventional apparatus

Brit MPs are being encouraged to pay attention to the role software plays as they prepare a report on reproducibility in the science and technology industry, which adds around £36bn to the economy.

According to joint academics group the Software Sustainability Institute, about 69 per cent of research is produced with specialist software, which could be anything from short scripts to solve a specific problem, to complex spreadsheets analysing collected data, to the millions of lines of code behind the Large Hadron Collider and the Square Kilometre Array.

"With many studies, research published without the underlying software used to produce the results is unverifiable," the institute said in its submission to the Parliamentary Science and Technology Committee's Reproducibility and Research Integrity Inquiry.

The institute said a 2014 workshop had revealed that research software "is infrequently subjected to the same scientific rigour as is applied to more conventional experimental apparatus."

The committee has just published 86 pieces of written evidence addressing this broad and difficult problem.

The reproducibility crisis came to the fore in 2005 when Stanford School of Medicine professor John Ioannidis published a paper titled: "Why most published research findings are false."

Since then, the issue has arisen in a number of studies demonstrating the prevalence of irreproducible data, the committee said. UK Research and Innovation (UKRI), the non-departmental public body funded through the Department for Business, Energy and Industrial Strategy, is in the process of establishing a national research integrity committee.

However, the specific issue of reproducible research has thus far been overlooked, the Science and Technology Committee said.

Others question whether "crisis" is the right word. King's College London said in its submission: "We believe all stakeholders have a responsibility to use sober and accurate language to discuss these challenges and question whether 'crisis' risks overgeneralising a complex set of issues and may allow for misrepresentation in the media which has the potential to needlessly damage public trust.

"Referring to irreproducible research findings as a 'crisis' may also imply that this is an acute issue that can be swiftly resolved, rather than a characteristic embedded in our current research culture."

Crisis or not, the problem has many different strands, from medicine to neuropsychology, and even to research into battery performance, according to the submissions.

None of this should mean the underlying role of software is ignored, the Software Sustainability Institute argued.

"To enable systemic change to improve reproducibility and research integrity, the quality and transparency of software must be improved," it said in its submission.

"Research software engineers have a key role to play by making the software used in research more robust and reusable, and helping train researchers in the fundamentals of publishing code so others can review and inspect it."
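The sort of basic hygiene the institute describes can be sketched in a few lines. The script below is illustrative only, not code from any submission: it fixes a random seed and publishes the software environment alongside a toy result, so anyone rerunning it can check they get the same number.

```python
import json
import platform
import random
import sys

SEED = 42  # fixed seed so the "analysis" is repeatable


def run_analysis(seed: int) -> float:
    """Toy stand-in for an analysis step that uses randomness."""
    rng = random.Random(seed)
    samples = [rng.gauss(0.0, 1.0) for _ in range(1000)]
    return sum(samples) / len(samples)


result = run_analysis(SEED)

# Publish the provenance with the result, not just the number itself.
provenance = {
    "result": result,
    "seed": SEED,
    "python": sys.version.split()[0],
    "platform": platform.platform(),
}
print(json.dumps(provenance, indent=2))
```

A reader with the same seed and interpreter can rerun the script and verify the figure, which is the kind of inspection the institute argues published research should allow.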

According to its website, the Software Sustainability Institute has worked since 2010 to advance research by cultivating better, more sustainable research software. It has received funding from all seven research councils, and its mission is to become the world-leading hub for research software practice.

The Science and Technology Committee will hold oral sessions to take evidence from December and a report will follow. Then we'll see if software is given due consideration. ®
