On secure coding training
Can secure coding be learned? Can it be taught?
What happens when we appropriately resource and incentivize developer training?
The problem (maybe)
So why does the tech world have a cybersecurity problem? If you believe that “software is eating the world” 1, then perhaps much of the answer lies in how we develop software.
To the extent that a given vulnerability is a security bug a developer unwittingly wrote into the code, how many bugs could we prevent from being introduced by improving secure coding competency?
After all, if a well-meaning developer knows to write secure code, and knows how, under what circumstances would vulnerable code still be produced?
“Videos & quizzes only” is a trap
Assuming it is offered at all, secure coding training is often conceived as a check-the-box compliance activity, so it gets resourced accordingly and lightly implemented, typically as videos & quizzes only.
So what can a developer realistically get out of such an approach?
They certainly practice watching videos and taking tests, but it is unclear how well this activity translates to increased secure coding competency. Arguably, developers are practicing the wrong thing.
Left of SAST
As an application security engineer working in DevOps, I try to help teams by implementing self-service infrastructure (i.e., platform engineering) that puts alert data into developers' hands at “the right time”, ideally in phase with their standard workflow and while newly written code is still fresh in mind (bugs are cheaper to fix then).
This primarily leverages SAST 2 at the pull request, which is great for what it is 3, but are there meaningful things we can do earlier in the SDLC – especially to avoid writing the bug in the first place?
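The pull request gate itself is simple in outline. Here is a minimal sketch, assuming a Python codebase, Bandit as the SAST scanner, and origin/main as the target branch; these are illustrative assumptions, not details of the program described in this post.

```python
# Hypothetical PR-time SAST gate: scan only the Python files changed in the
# pull request and fail the check if the scanner reports findings.
# Assumptions (not from this post): Bandit as the scanner, origin/main as the
# target branch, and a CI runner with git history available.
import subprocess
import sys


def changed_python_files(base: str = "origin/main") -> list[str]:
    # Diff the PR branch against the target branch so only new/changed code is scanned.
    out = subprocess.run(
        ["git", "diff", "--name-only", "--diff-filter=ACM", f"{base}...HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [path for path in out.splitlines() if path.endswith(".py")]


def main() -> int:
    files = changed_python_files()
    if not files:
        print("No changed Python files to scan.")
        return 0
    # Bandit exits non-zero when it finds issues, which fails this CI step
    # and keeps the branch from merging until the findings are fixed or triaged.
    return subprocess.run(["bandit", *files]).returncode


if __name__ == "__main__":
    sys.exit(main())
```

Useful as that is, it only catches the bug after it has been written.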
That got me thinking about whether a more thoughtfully designed secure coding training program has untapped potential, especially in the context of modern DevSecOps with its platform engineering focus.
Testing the hypothesis
Some may know that I inherited a “videos & quizzes only” secure coding training program (a good one for what it was) a few years ago. This past year, I re-designed and re-implemented it around SecureFlag, an innovative platform that provides high-quality hands-on labs, a “hack it first, then fix it” approach for developers, and a way to measure and score secure coding competency.
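To give a generic flavor of “hack it first, then fix it” (an invented example, not a SecureFlag lab): a developer might first exploit a SQL injection flaw and then repair it with a parameterized query.

```python
# Invented illustration of a "hack it first, then fix it" exercise; the schema
# and function names are hypothetical and not taken from SecureFlag.
import sqlite3


def find_user_vulnerable(conn: sqlite3.Connection, username: str):
    # "Hack it first": input like  x' OR '1'='1  changes the query's meaning
    # because untrusted data is spliced directly into the SQL string.
    query = f"SELECT id, email FROM users WHERE username = '{username}'"
    return conn.execute(query).fetchall()


def find_user_fixed(conn: sqlite3.Connection, username: str):
    # "Then fix it": a parameterized query keeps data separate from SQL,
    # so the same input is treated as a literal value rather than as code.
    query = "SELECT id, email FROM users WHERE username = ?"
    return conn.execute(query, (username,)).fetchall()
```

Exploiting the flaw before fixing it is exactly the kind of practice that videos and quizzes never provide.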
Based on pilot results so far, I have reason to be cautiously optimistic, but I will need to figure out how to connect anticipated increases in secure coding competency to outcomes we care about. That's trickier, as application security does not (yet??) have the equivalent of what DevOps has with DORA metrics 4.
Fingers crossed.
– JW
Footnotes
1. See Marc Andreessen in 2011. Or, more recently, Satya Nadella in 2019 in a similar vein.
2. Static application security testing. In other words, scanning application code for known classes of vulnerabilities.
3. As a timely safety net to catch security bugs that are detectable through static analysis of the code. Ideally these are fixed before the branch can be merged.
4. See Nicole Forsgren's book Accelerate.