When DevOps and cyber security collide
How can the IT industry stop the CVE rot and get software bug numbers down?
Sponsored Feature Security bugs in code are a bit like reality shows and Instagram influencers: an irritant we were once willing to tolerate which has subsequently grown to become a much more serious problem.
The National Vulnerability Database (NVD) classified fewer than 2,000 common vulnerabilities and exposures (CVEs) in 2001. Last year, that number hit 20,000 for the first time. That's partly because we're better at detecting them than we used to be, but it's also due to a proliferation of software. It brings the problem of application security home. So how can we stop the rot and get those bug numbers down?
Matias Madou, co-founder and CTO of Secure Code Warrior, says that security has lagged behind in the software application space. "In the last 10-15 years we made improvements in security in other areas such as network and endpoints," he says. "Application security didn't move that quickly."
Part of the problem was that developers had so many other issues to address, which generally took precedence. Secure Code Warrior's 2022 State of Developer-Driven Security survey found that security ranked last as a priority among the developers polled, cited by just 14 percent. Regular headlines about flaws in code leading to data breaches mean they can no longer afford to push security down the list, but the origins of that attitude are to some extent cultural, adds Madou.
Throwing code over the wall
Application security and software development have not historically been integrated. Software developers handled the coding, while a second group of application security people was responsible for ensuring that software behaved properly. This meant coders saw security as someone else's problem.
To make things worse, the relationship was often dysfunctional. "Those two groups didn't get along very well because application security people were pointing out problems and telling the developers how bad their code was," he says.
The groups also worked on an ad hoc basis rather than in a regular, structured way. Waterfall software development practices exacerbated that problem, because teams only handled application security at the end of the software development process.
A software scan might produce results, but leave little time to act on them. Instead, companies had to triage vulnerabilities: the worst were dealt with as best the team could manage while trying to get the software over the line, and the rest ended up as technical debt for another day.
The Secure Code Warrior survey suggests that developers prioritise paying down that technical debt. It's a difficult line to walk, because they have to pick off the worst security issues while trying to meet functionality requirements from the business.
"They have a baseline which means 'we cannot introduce more problems, but we're not going to reduce the technical debt either'," Madou explains.
This traditional approach to software development put cybersecurity last by enabling both sides to avoid responsibility. It created the perfect breeding ground for software vulnerabilities, which become difficult to patch in a waterfall development process.
If a stitch in time saves nine, the reverse is also true: leaving it too late creates an expensive hole. Fixing software at the very end of the production cycle is far more costly.
"If a security bug was critical enough you'd get an off-cycle release," says Madou. That causes more overhead, because the bug would be kicked back to engineers who might not have created it in the first place. "So they must then make themselves familiar with the code and the problem before finding a solution."
This reactive approach to security also affects software reliability, warns Madou. "Software security correlates with quality," he points out. If one suffers, so does the other. Quality, incidentally, was another top priority in Secure Code Warrior's developer survey.
The rise of agile and DevOps
Agile development introduced shorter iterations which enabled developers to increase their release cadence. While this was a good thing for developers, it left security lagging behind. More frequent releases call for faster scans, which app security teams were not used to.
This new requirement changed the way that app security teams had to measure their performance. Metrics changed, from simply reporting how many problems security pros could find in the code to how quickly the devs and the security team could jointly solve them.
"It also meant that the application security team needed to know how to program," explains Madou. Recruiters had to start looking for people with a developer background rather than recruiting network experts, for example.
Those that did it right could handle fixes inline, closer to the source of production, but it wasn't easy. "The transition doesn't happen overnight and not just by using buzzwords," he says. "A lot of people used all the right terminology but didn't ship quickly."
The evolution of DevOps culture shifted things again, speeding up release cadence still further. It brought positive and negative implications for application security, explains Madou.
Some DevOps cultures encouraged a 'move fast and break things' approach that affected software security, for example. A common refrain is that you can always test a feature and roll it back if you don't want it, though in truth people often don't.
"This means that now suddenly, because of that rapid development and production and trying stuff out, you take problems into production," says Madou. This is a particular issue when using immature continuous integration and continuous deployment (CI/CD) pipelines that lack sufficient gated security measures.
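A gated security measure in a CI/CD pipeline often amounts to a small check between scan and deploy: if the scanner reports anything severe enough, the job exits non-zero and the release stops. The sketch below illustrates the idea in Python; the JSON report shape, severity names, and threshold are illustrative assumptions, not the output format of any particular tool.

```python
import json
import sys

# Assumed severity levels that should block a release; a real gate would
# take these from the organisation's own policy and scanner output.
BLOCKING_SEVERITIES = {"critical", "high"}

def gate(findings):
    """Return the subset of scanner findings severe enough to block a release."""
    return [
        f for f in findings
        if f.get("severity", "").lower() in BLOCKING_SEVERITIES
    ]

def main(report_path):
    # Load a (hypothetical) JSON list of findings produced by a scanner step.
    with open(report_path) as fh:
        findings = json.load(fh)
    blocking = gate(findings)
    for f in blocking:
        print(f"BLOCKED: {f['rule']} in {f['file']} ({f['severity']})")
    # A non-zero exit code fails the CI job, keeping the flaw out of production.
    return 1 if blocking else 0

if __name__ == "__main__" and len(sys.argv) > 1:
    sys.exit(main(sys.argv[1]))
```

Wired into a pipeline as a step between the scan and the deploy jobs, a check like this turns "we'll fix it later" into "the build is red until someone fixes it."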
On the other hand, DevOps theoretically made developers more responsible for what they'd written as they now had to manage it in production. It also shifted security even further forward in the software development process, prioritising it in the design and early coding stages, creating the platform for a more integrated approach to application security.
The need to teach coders
But process changes alone aren't enough to create security-conscious developers, Madou warns - education is an important factor. Sadly, traditional university programs don't explore this discipline deeply enough.
"Teaching application security is a hard problem because every language and framework behaves differently," he explains. That makes it an on-the-job education process. "It's up to the organizations to upskill new employees in the security aspects of the language, framework, and libraries they're using on a daily basis to build secure software."
In practice, many companies don't have the time or funds to teach secure software development. They just want employees to code and ship. Where there is training, there's often little certification involved to prove practical, hands-on secure coding experience.
This is where Secure Code Warrior comes in. Madou, who has a PhD in application security, worked at a security company that specialised in finding security bugs in code. "Seven years in, I started to realise that it's not that hard to find problems in code," he recalls. After all, both black hats and white hats do it all the time. "But if you don't teach the developers to code securely the bugs will keep coming," he says.
Secure Code Warrior aims explicitly to educate developers on the job and promote secure coding. Its Learning Platform is an education system that features educational resources in 60 computer languages, from C to COBOL. These don't just cover the ones handling control flow; the platform also includes declarative languages used to set up infrastructure-as-code environments, for example. It aligns these courses along guided learning pathways to help educate developers in specific vulnerability types.
The Learning Platform uses assessments to check on how well the developers are assimilating this information, and includes a gamification element. Developers can undertake coding simulations to test out their skills in real-world scenarios.
Secure Code Warrior takes gamification further with tournaments where coders can compete with each other to demonstrate their secure coding practices. The leaderboard can highlight potential candidates to direct the application security effort on the developer side.
The company also offers software integration with bug-finding tools like GitHub's CodeQL, using text searches from scan reports to find educational resources on the vulnerability in question. That helps coders to fix the flaws on the job.
Madou says that mature companies often use the platform to help pay down the security issues that languish as technical debt. He advises them to identify a specific problem that threatens their organisation, such as SQL injection flaws created by a team of unwitting developers over the years.
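The SQL injection class mentioned above comes down to building query strings out of attacker-controlled input. A minimal before-and-after sketch, using Python's standard sqlite3 module purely as an illustration (the function names and schema are invented for this example):

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable: user input is spliced directly into the SQL text, so a
    # payload like "x' OR '1'='1" changes the meaning of the query.
    query = f"SELECT id, name FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterised: the driver sends the value separately from the SQL,
    # so the input can never be interpreted as query syntax.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()
```

The fix is a one-line change per call site, which is why an organisation-wide sweep of a single vulnerability class, as Madou describes, is a tractable first campaign.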
"We try to figure out what is the crown jewel in the organization and what is its biggest threat," he says. "Then we like to roll out the program and make sure everybody gets upskilled." That equips the development team to fix the biggest holes in its code base first, eliminating the big risks. Then, the company can repeat the process until it has dealt with all of its most pressing application security issues.
Organizations must make the case for the platform before they can use it. "Compliance is a good starting point to introduce something that can benefit the entire organization for the long term," Madou adds. However, ticking compliance boxes should be the beginning of the platform's value, not the end. When used properly, it can help a company to go beyond regulatory expectations and drive security throughout its software development life cycle.
The road to secure code is long and arduous, and companies might find themselves having to drag their developers along on the journey. When it comes to instilling the right practices and values to prevent the next headline-generating data breach, every little helps.
Sponsored by Secure Code Warrior.