Google dishes out homemade SLSA, a recipe to thwart software supply-chain attacks

Try it with phish'n'chips

Google has proposed a framework called SLSA for dealing with supply chain attacks, a security risk exemplified by the recent compromise of the SolarWinds Orion IT monitoring platform.

SLSA – short for Supply chain Levels for Software Artifacts and pronounced "salsa" for those inclined to add convenience vowels – aspires to provide security guidance and programmatic assurance to help defend the software build and deployment process.

"The goal of SLSA is to improve the state of the industry, particularly open source, to defend against the most pressing integrity threats," said Kim Lewandowski, Google product manager, and Mark Lodato, Google software engineer, in a blog post on Wednesday. "With SLSA, consumers can make informed choices about the security posture of the software they consume."

Supply chain attacks – which exploit weaknesses in the software creation and distribution pipeline – have surged recently. Beyond the SolarWinds incident and the exploitation of vulnerabilities in Apache Struts, there have been numerous attacks on software package registries like npm, PyPI, RubyGems, and Maven Central, which host the code libraries developers rely on to build complex applications.

According to security biz Sonatype [PDF], attacks on open source projects increased 430 per cent during 2020. One plausible explanation is that compromising a dependency in a widely used library guarantees broad distribution of malware. As a 2019 TU Darmstadt research paper noted, the top five npm packages in 2018 "each reach between 134,774 and 166,086 other packages, making them an extremely attractive target for attackers."

Eating your own dog food

SLSA is based on Google's own internal security process, "Binary Authorization for Borg," used within the ad giant for more than eight years and currently obligatory for any production workload. It consists of standards (rules), accreditation (by which standards compliance can be established), and technical controls (signed metadata for automated policy frameworks).

For now, SLSA amounts to a set of sound security recommendations. The hope is that it will become something more enforceable than that.

"In its final form, SLSA will differ from a list of best practices in its enforceability: it will support the automatic creation of auditable metadata that can be fed into policy engines to give 'SLSA certification' to a particular package or build platform," explain Lewandowski and Lodato.

Four levels of compliance are envisioned. The highest, SLSA 4, requires that all changes be reviewed by two people and that builds be hermetic and reproducible, providing reasonable confidence that nothing has been tampered with.
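To get a feel for how that certification might work in practice, here is a minimal, hypothetical sketch – the struct fields and level rules below are simplifications for illustration, not the actual SLSA specification or any real policy-engine API – of how build provenance metadata could be mapped onto a SLSA level.

```rust
// Illustrative only: field names and level rules are simplified assumptions,
// not the real SLSA spec or tooling.

struct BuildProvenance {
    provenance_available: bool, // some machine-readable build provenance exists
    built_by_service: bool,     // build ran on a hosted, source-aware build service
    provenance_signed: bool,    // provenance is signed and verifiable
    two_person_review: bool,    // every change was reviewed by two people
    hermetic: bool,             // build had no undeclared network or dependency access
    reproducible: bool,         // rebuilding yields a bit-for-bit identical artifact
}

/// Return the highest (hypothetical) SLSA level this provenance record satisfies.
fn slsa_level(p: &BuildProvenance) -> u8 {
    if p.provenance_signed && p.two_person_review && p.hermetic && p.reproducible {
        4
    } else if p.provenance_signed && p.built_by_service {
        3
    } else if p.built_by_service {
        2
    } else if p.provenance_available {
        1
    } else {
        0
    }
}

fn main() {
    let artifact = BuildProvenance {
        provenance_available: true,
        built_by_service: true,
        provenance_signed: true,
        two_person_review: false,
        hermetic: false,
        reproducible: false,
    };
    println!("Artifact qualifies for SLSA level {}", slsa_level(&artifact));
}
```

In the framework as described, the decision would be driven by signed, automatically generated attestations rather than hand-filled booleans, but the shape of the check – metadata in, certification level out – is the same.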

The hope is that SLSA will help catch issues like hypocrite commits, compromised source control platforms, maliciously modified or compromised build infrastructure, subverted dependencies, dangerous build artifacts, hijacked repositories, and typosquatting attacks.

On a related security note, Google and ISRG – the non-profit behind Let's Encrypt and other helpful projects – said on Thursday that they have been funding developer Miguel Ojeda since April to work on Rust for Linux and other security efforts, and plan to continue doing so for a year. Adding more Rust code to the Linux kernel is expected to reduce memory safety errors.

"Since [the Linux kernel is] written largely in the C language, which is not memory-safe, memory safety vulnerabilities such as buffer overflows and use-after-frees are a constant concern," explained Josh Aas, ISRG executive director, in a blog post.

"By making it possible to write parts of the Linux kernel in Rust, which is memory-safe, we can entirely eliminate memory safety vulnerabilities from certain components, such as drivers." ®
