Google says open source software should be more secure

At the White House Open Source Summit, the Chocolate Factory floated a few ideas to make that happen

In conjunction with a White House meeting on Thursday at which technology companies discussed the security of open source software, Google proposed three initiatives to strengthen national cybersecurity.

The meeting was arranged last month by US national security adviser Jake Sullivan, amid the scramble to fix the Log4j vulnerabilities that occupied far too many people over the holidays. Sullivan asked invited firms – a group that included Amazon, Apple, Google, IBM, Microsoft, and Oracle – to share ideas on how the security of open source projects might be improved.

Google chief legal officer Kent Walker said in a blog post that, just as government and industry have worked to shore up shoddy legacy systems and software, the Log4j repair process – still ongoing – has demonstrated that open source software needs the same attention as critical infrastructure.

"For too long, the software community has taken comfort in the assumption that open source software is generally secure due to its transparency and the assumption that 'many eyes' were watching to detect and resolve problems," said Walker. "But in fact, while some projects do have many eyes on them, others have few or none at all."

Pointing out Google's various efforts to be part of the solution, he outlined several possible public-private partnerships that were mentioned at the meeting:

  • To identify a list of critical open source projects
  • To establish baseline standards for security, maintenance, provenance, and testing
  • To set up a maintenance marketplace, to match volunteers to needy projects

Laudable ideas all, if not particularly radical, unexpected, or novel.

Knowing which open source projects have the widest reach is certainly important to understanding where bugs would have the widest impact. Google software engineers have already been thinking about defining "criticality" in the context of software, so that work is underway. In fact, there's software to generate a criticality score for other software.
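That scoring approach – the OpenSSF's criticality_score project is one public example – boils down to aggregating weighted signals about a project into a single number between 0 and 1. The sketch below is illustrative only: the signal names, weights, and thresholds are assumptions for demonstration, not the parameters any particular tool actually uses. Each signal saturates at a threshold so that one enormous value can't dominate the score.

```python
import math

def criticality_score(signals):
    """Aggregate weighted project signals into a score in [0, 1].

    Each signal is a (weight, value, threshold) tuple; a signal's
    contribution is weight * log(1 + value) / log(1 + max(value, threshold)),
    so it grows with the value but caps out at 1 once the threshold is hit.
    """
    total_weight = sum(weight for weight, _, _ in signals)
    weighted_sum = sum(
        weight * math.log(1 + value) / math.log(1 + max(value, threshold))
        for weight, value, threshold in signals
    )
    return weighted_sum / total_weight

# Hypothetical signals for one project: (weight, observed value, threshold)
example = [
    (1.0, 4000, 5000),  # e.g. number of dependent repositories
    (2.0, 120, 100),    # e.g. contributors in the past year (saturated)
    (0.5, 30, 26),      # e.g. average commits per week (saturated)
]
print(round(criticality_score(example), 3))
```

Ranking every project in an ecosystem by a score like this is what lets an initiative such as Google's "list of critical open source projects" be computed rather than hand-picked.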

As for baseline standards, the Open Source Security Foundation is already on the case, and we already have frameworks like the Google-devised Supply-chain Levels for Software Artifacts (SLSA). So that, too, is a work in progress.

Walker's description of an organization to connect projects with volunteer helpers employed at companies sounds a lot like any of the several open source sustainability efforts, just without the specific monetary component of GitHub Sponsors or Patreon.

"Many leading companies and organizations don’t recognize how many parts of their critical infrastructure depend on open source," said Walker. "That’s why it’s essential that we see more public and private investment in keeping that ecosystem healthy and secure."

That's what everyone keeps saying, though often without paying.

Power in a union

Mike Hanley, chief security officer at GitHub, also had something to say on the subject: "First, there must be a collective industry and community effort to secure the software supply chain," he said in a blog post. "Second, we need to better support open source maintainers to make it easier for them to secure their projects."

Katie Moussouris, founder of Luta Security, told The Register in a phone interview that Google, as part of what she described as the security one per cent, does a lot of good work on its own product security and on security related to the software ecosystem. But that work, she said, is purely voluntary.

"If the US government is concerned about securing open source, then it does need to get more serious in terms of providing support to the open source community that is not volunteer, charity work from the security one per cent like Google and Microsoft and other elite, large service providers that were invited to the White House today," she explained.

Moussouris suggested we need to adopt a model that's more like Universal Basic Income for the developer community, in part because it's a challenge to identify which projects are critical and which are not.

"The open source community definitely needs some form of universal basic income, because there are projects that start out as hobbies by one individual, and predicting popularity becomes a very difficult thing," she said.

These projects often exist without much attention until there's a security vulnerability and people realize there's only a single maintainer, she said. While the government should appreciate the contributions of large companies like Google and its peers, "it cannot rely on the volunteer charity labor and donations of the security one per cent mega-corporations if it's going to solve this problem," she said.

Asked whether a software license that imposes financial support obligations on large users of open source projects might help, Moussouris wasn't certain licensing was the ideal approach to make open source more sustainable and more secure. But she voiced support for shifting revenue from the haves to the have-nots as a general goal.

"If the idea is to drive more of those who are profiting from open source and more of those profit dollars towards those who are building open source – as in the maintainers, and those who are doing it for free, or for very little financial support – if the goal is to drive more of those open source-derived profits back into the hands of the maintainers, I'm all for it," she said.

Moussouris added that getting money to open source maintainers can be complicated. It's often not easy to identify who to pay or how to pay them. "You can't just cut a check from the government to an individual person, and that's true around the world," she said.

Another issue not addressed by Google's proposals is the need for specific security skills in the bug-fixing process. Moussouris pointed to the lack of root-cause analysis with Log4j, which allowed multiple variants to be developed that bypassed the initial fix. The Log4j developers, she said, didn't understand the scope of the vulnerability that had been reported.

"That's the problem that's not gonna be solved by throwing more developers at [the problem] – these are different job roles," she explained. "So that is a gap in what everyone is talking about here in terms of support." ®