US cybersecurity chief: Software makers shouldn't lawyer their way out of security responsibilities
Who apart from Microsoft is happy with the 'ship now, oh just fix it later' approach?
SCSW What's more dangerous than Chinese spy balloons? Unsafe software and other technology products, according to America's Cybersecurity and Infrastructure Security Agency (CISA) Director Jen Easterly.
During a speech at Carnegie Mellon University on Monday, Easterly said technology providers must prioritize security in their products over other incentives such as cost, features, and speed to market. And she suggested that the government hold companies liable for selling vulnerable products that criminals and nation states later exploit in cyberattacks.
"Government can work to advance legislation to prevent technology manufacturers from disclaiming liability by contract, establishing higher standards of care for software in specific critical infrastructure entities, and driving the development of a safe harbor framework to shield from liability companies that securely develop and maintain their software products and services," Easterly said.
"While it will not be possible to prevent all software vulnerabilities, the fact that we've accepted a monthly 'Patch Tuesday' as normal is further evidence of our willingness to operate dangerously at the accident boundary," she added.
The end of Patch Tuesday?
And speaking of Redmond, we're told Microsoft does a lousy job encouraging its corporate customers to use multi-factor authentication (MFA). So does Twitter, which is shortly going to switch off SMS MFA for everyone except Blue subscribers; all tweeters can use free and frankly more secure alternatives, such as Google Authenticator, to log in.
Apple claims 95 percent of its iCloud users enable MFA, Easterly said. For comparison: Twitter reports fewer than three percent of its users turn on any type of MFA, while Microsoft puts the number at about 25 percent of its enterprise customers — and only about one-third of those companies' admin accounts use MFA, Easterly noted.
"Apple's impressive MFA numbers aren't due to random chance. By making MFA the default for user accounts, Apple is taking ownership for the security outcomes of their users," Easterly said, adding that even though Twitter and Microsoft's MFA percentages are "disappointing," at least they publicly disclose this data.
"By providing radical transparency around MFA adoption, these organizations are helping shine a light on the necessity of security by default," Easterly expounded.
"More should follow their lead — in fact, every organization should demand transparency regarding the practices and controls adopted by technology providers and then demand adoption of such practices as basic criteria for acceptability before procurement or use," she added, calling on manufacturers to be "transparent" with their vulnerability disclosure policies, to protect security researchers who find and report these bugs, and to fix the root cause of the security flaws.
Build security in
Making software "secure-by-design," and thus putting the liability on the vendors to sell safe products out of the box instead of pushing that responsibility on to consumers and businesses, is a drumbeat that CISA has been pounding under Easterly's leadership.
Beyond turning on MFA by default, here's what that looks like in practice.
"Security-by-design includes actions like transitioning to memory-safe languages, having a transparent vulnerability disclosure policy, and secure coding practices," Easterly said.
Using programming languages like Rust, Go, Python, and Java (instead of C and C++) can eliminate memory-safety vulnerabilities, which currently account for around two-thirds of all known software vulnerabilities, according to CISA.
Memory safety bugs — such as out-of-bounds reads and writes or use after free() — also increase the cost of software development when not caught early.
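To make that concrete, here is a minimal Rust sketch (not from Easterly's talk) showing how the two bug classes just named are handled: a use-after-free becomes a compile-time "use of moved value" error, and an out-of-bounds read is caught rather than silently reading adjacent memory.

```rust
// Sketch: how Rust neutralizes use-after-free and out-of-bounds reads.

fn main() {
    // 1. Use-after-free: ownership moves, and the old binding dies.
    let buffer = String::from("sensitive data");
    let owner = buffer; // the heap allocation now belongs to `owner`

    // The next line would be a use-after-free in C; in Rust it is a
    // compile error ("borrow of moved value: `buffer`"), so it never ships:
    // println!("{}", buffer);

    println!("{}", owner); // exactly one live owner; freed once, automatically

    // 2. Out-of-bounds read: slice indexing is bounds-checked.
    let bytes = [1u8, 2, 3];
    // `bytes[10]` would panic at runtime rather than leak adjacent memory;
    // the checked accessor `get` turns the failure into a value instead:
    assert_eq!(bytes.get(10), None);
    assert_eq!(bytes.get(1), Some(&2));
}
```

The trade-off Easterly alludes to is visible here: the C version of the commented-out line compiles fine and becomes an expensive bug later, while the Rust compiler rejects it before the code ever reaches a customer.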
Easterly cited Google's recent announcement that Android 13 is the first release where the majority of new code added was written in a memory safe language — Rust, Java, or Kotlin. And, she added, quoting Google, "'There have been zero memory safety vulnerabilities discovered in Android's Rust code.'"
Additionally, Mozilla, which championed Rust, is working to integrate that language into Firefox, and Amazon Web Services is also building cloud services in Rust. ®