
For secure code, maintainability matters

by SonarSource, October 25th, 2020

Too Long; Didn't Read

Writing reliable (bug-free) and maintainable (easy to understand) code inherently makes it more secure. The Heartbleed Bug, a serious vulnerability in OpenSSL that allowed attackers to steal sensitive information, went undetected for so long partly because the code was too complex to review. The CWE, a crowd- (of experts) sourced list of about 1,300 common software and hardware weaknesses that have "security ramifications", classes plain Bugs and Code Smells among those weaknesses, making the case for both maintainability and reliability as contributors to security.

Author Robert Collier said that "Success is the sum of small efforts repeated day in and day out." That's especially true when it comes to security. By now we all understand that securing your systems isn't as simple as installing a firewall and calling it a day. Instead, it's multiple actions and strategies in concert, implemented consistently over time. And believe it or not, one small but important strategy is simply writing code that's reliable (bug-free) and maintainable (easy to understand). Yes, I know that sounds too simple, and possibly even self-serving. So in this post I'll lay out some of the evidence for how writing reliable and maintainable code means you're inherently writing more secure code.

Poor maintainability contributed to Heartbleed

To make the case for how maintainable code contributes to security, I'll start with the Heartbleed Bug. Remember that one? It was a serious vulnerability in OpenSSL that allowed attackers to steal sensitive information with a really trivial attack that XKCD illustrates beautifully. David A. Wheeler teaches a graduate course in secure development at George Mason University. He wrote an extensive analysis of the vulnerability. In it, he laid part of the blame on the difficulty of simply understanding the code involved: "Many of the static techniques for countering Heartbleed-like defects, including manual review, were thwarted because the OpenSSL code is just too complex. Code that is security-sensitive needs to be 'as simple as possible'."
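
To make that concrete, here's a minimal sketch of the defect pattern behind Heartbleed. The handler name, signature, and simplifications are mine for illustration, not the actual OpenSSL code:

```c
#include <stdlib.h>
#include <string.h>

/* A simplified sketch of the Heartbleed pattern; hypothetical
 * function, not the actual OpenSSL code. */
unsigned char *handle_heartbeat(const unsigned char *payload,
                                size_t claimed_len, /* length the peer claims */
                                size_t actual_len)  /* length that actually arrived */
{
    /* The missing bounds check that made Heartbleed possible:
     * if (claimed_len > actual_len) return NULL; */

    unsigned char *response = malloc(claimed_len);
    if (response == NULL)
        return NULL;

    /* BUG: trusts the peer's length field. When claimed_len exceeds
     * actual_len, this reads past the end of the payload buffer and
     * copies adjacent process memory (private keys, session data)
     * into the response echoed back to the attacker. */
    memcpy(response, payload, claimed_len);
    return response;
}
```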

When the Heartbleed Bug was eventually found, it was actually detected by human review rather than static analysis. It's worth noting explicitly here that the problem wasn't caught in peer review, but long after the merge by independent security researchers. In his analysis, Wheeler discusses why Heartbleed wasn't found sooner. "Little things like code formatting matter," he says, "since badly-formatted code is much harder for humans to review." Code Smell / Maintainability rules for things like code formatting and naming conventions are often dismissed as trivial, maybe because they're about things so foundational that people take them for granted. As Wheeler points out, that doesn't mean they're not important. 

Wheeler suggests that attention to maintainability leads to more secure software, and continues: "The goal should be code that is obviously right, as opposed to code that is so complicated that I can’t see any problems." And of course, that's what Code Smell rules help you do - write code that's maintainable and easy to read so that it's possible for it to be "obviously right".

Experts: Bugs and Code Smells are security 'weaknesses'

Of course, Wheeler's just one person, and opinions are like belly buttons, right? So let's look at another source: the CWE, which makes the case for both maintainability and reliability as contributors to security. 

I want to start with the 2020 CWE Top 25 Most Dangerous Software Weaknesses, which is an expert-sourced subset of the CWE. But first, some background: CWE stands for Common Weakness Enumeration. It's a crowd- (of experts) sourced list of common software and hardware weaknesses that have "security ramifications". It has about 1,300 entries, including quite a few that are used for categorization. The Top 25 is a list of "the most common and impactful issues experienced over the previous two calendar years." Given this build-up, it would be reasonable to assume that all 25 CWEs in the list describe security vulnerabilities. But by my count, nearly a third are bugs. Bugs that could lead to security breaches, but bugs nonetheless. For instance, lucky number 13 in the list is CWE-476, NULL Pointer Dereference.
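
To see why a "mere" bug like CWE-476 earns a spot on a security list, consider this hypothetical sketch (the function and its names are invented for illustration) of how a null dereference becomes a remotely triggerable denial of service:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical request parser illustrating CWE-476. */
void greet_user(const char *request)
{
    /* strchr returns NULL when the request has no '=' separator... */
    const char *value = strchr(request, '=');

    /* BUG: ...so a single malformed request dereferences a null
     * pointer here and crashes the process: a reliability bug that
     * doubles as a denial-of-service vector. The fix is the obvious
     * one: if (value == NULL) { reject the request; return; } */
    printf("Hello, %s\n", value + 1);
}
```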

In fact, by one count, about 60% of CWEs aren't vulnerabilities at all. CWE-699 is the Software Development view. It "organizes weaknesses around concepts that are frequently used or encountered in software development". It contains 40 sub-categories, including Complexity Issues, Numeric Errors and Bad Coding Practices. Of the 59 leaf listings under Bad Coding Practices, the first is the beautifully emblematic CWE-478, Missing Default Case in Switch Statement.

This is not a rule most people see as important for Code Security. At SonarSource, we don't even class it as a Bug, but as a Code Smell / Maintainability problem. But its inclusion in the CWE says that experts in the field see it as important for security, because small, consistent efforts like providing `default` clauses help you write "code that is obviously right".
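
For illustration, here's a minimal sketch of CWE-478 in C (the enum and function are hypothetical):

```c
enum access_level { ADMIN, USER, GUEST };

/* Hypothetical permission check illustrating CWE-478. */
int can_delete(enum access_level level)
{
    switch (level) {
    case ADMIN:
        return 1;
    case USER:
        return 0;
    /* CWE-478: no default clause. A GUEST value (or any value added
     * to the enum later) falls out of the switch, and the caller gets
     * an indeterminate return value that might silently grant access.
     * The "obviously right" fix is one line:
     *   default: return 0;  (fail closed) */
    }
}
```

One `default` clause that fails closed turns a latent hole into code a reviewer can verify at a glance, even as the enum grows.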

Maintainability + reliability -> secure code

With the Heartbleed Bug we saw that maintainability matters for secure code. If you can't understand the code, then you can't be sure it isn't hiding vulnerabilities. Add the CWE into consideration, and you see that experts in the field agree that both maintainability and reliability are important contributors to code security.

To be clear, static analysis didn't find the Heartbleed Bug. And there will always be bugs and vulnerabilities that static analysis can't find. For instance, there's no way to know statically that you shouldn't allow a negative integer as a shopping cart quantity. That's the job of peer review. In the case of Heartbleed, peer review failed because the code wasn't understandable / maintainable. 
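
As a hypothetical illustration (the function is invented for this post), here's code that's perfectly clean to an analyzer but obviously wrong to a human reviewer:

```c
/* Hypothetical cart logic: valid C that no static analyzer can flag,
 * because "quantity must be positive" is a business rule, not a
 * language rule. */
long cart_total_cents(long unit_price_cents, long quantity)
{
    /* BUG (visible only to a human reviewer): a negative quantity
     * yields a negative total, i.e. a refund nobody paid for. */
    return unit_price_cents * quantity;
}
```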

Static analysis didn't find Heartbleed, but it could have put the peer reviewers in a better position to spot it on their own. Static analysis tools such as SonarQube and SonarCloud can easily find maintainability and reliability issues, as well as lots of valuable security issues such as injection vulnerabilities, XSS, XXE, deserialization flaws and more. When all three categories are implemented consistently over time, they'll help you gain control of your code's reliability and security.

With apologies to Robert Collier, perhaps Security success is the sum of small Maintainability efforts repeated day in and day out. Because maintainability matters. Especially for code security.

Previously published at https://blog.sonarsource.com/for-secure-code-maintainability-matters