Academics tell Brit MPs to check the software used when considering reproducibility in science and tech research

It's seldom subject to the same rigour as conventional apparatus

Brit MPs are being encouraged to pay attention to the role software plays as they prepare a report on reproducibility in the science and technology industry, which adds around £36bn to the economy.

According to joint academics group the Software Sustainability Institute, about 69 per cent of research is produced with specialist software, which could be anything from short scripts to solve a specific problem, to complex spreadsheets analysing collected data, to the millions of lines of code behind the Large Hadron Collider and the Square Kilometre Array.

"With many studies, research published without the underlying software used to produce the results is unverifiable," the institute said in its submission to the Parliamentary Science and Technology Committee's Reproducibility and Research Integrity Inquiry.

The institute said a 2014 workshop had revealed that research software "is infrequently subjected to the same scientific rigour as is applied to more conventional experimental apparatus."

The committee has just published 86 similar pieces of evidence addressing this broad and difficult problem.

The reproducibility crisis came to the fore in 2005 when Stanford School of Medicine professor John Ioannidis published a paper titled "Why Most Published Research Findings Are False".

Since then, the issue has arisen in a number of studies demonstrating the prevalence of irreproducible data, the committee said. UK Research and Innovation (UKRI), the non-departmental public body funded through the Department for Business, Energy and Industrial Strategy, is in the process of establishing a national research integrity committee.

However, the specific issue of reproducible research has thus far been overlooked, the Science and Technology Committee said.

Others question whether "crisis" is the right word. King's College London said in its submission: "We believe all stakeholders have a responsibility to use sober and accurate language to discuss these challenges and question whether 'crisis' risks overgeneralising a complex set of issues and may allow for misrepresentation in the media which has the potential to needlessly damage public trust.

"Referring to irreproducible research findings as a 'crisis' may also imply that this is an acute issue that can be swiftly resolved, rather than a characteristic embedded in our current research culture."

Crisis or not, the problem has many different strands, from medicine to neuropsychology, and even to research into battery performance, according to the submissions.

None of this means the underlying role of software should be ignored, the Software Sustainability Institute argued.

"To enable systemic change to improve reproducibility and research integrity, the quality and transparency of software must be improved," it said in its submission.

"Research software engineers have a key role to play by making the software used in research more robust and reusable, and helping train researchers in the fundamentals of publishing code so others can review and inspect it."

According to its website, the Software Sustainability Institute has, since 2010, facilitated the advancement of software in research by cultivating better, more sustainable research software to enable world-class research. It was previously awarded funding from all seven research councils, and its mission is to become the world-leading hub for research software practice.

The Science and Technology Committee will hold oral evidence sessions from December, with a report to follow. Then we'll see if software is given due consideration. ®
