My Love/Hate Relationship With Static Code Analysis Tools

I have a love/hate relationship with static code analysis tools. On the one hand, they’re great at helping teams eliminate substandard code practices, especially those that may result in security problems. On the other hand, they’re also widely misunderstood and misused, most often by management.

On the face of it, static code analysis tools like Fortify (a commercial tool — probably the one with the most name recognition) and Brakeman (the best tool for Ruby/Rails code) seem like an awesome idea. Rather than relying on diverse software development teams throughout the organization to properly and regularly follow best practices for coding and handling application security, let’s encode the best practices into a tool that will parse and analyze the team’s code and point out deficiencies.

Let’s parse that last sentence a bit more closely. There are two phrases that are potentially problematic: “best practices” and “deficiencies.”

Who decides on the best practices that need to be followed? And who says that code is deficient if it doesn’t meet a best practice that the tool checks for?

The reality is that it requires judgement to assess whether a “deficiency” pointed out by a static code analysis tool is actually a problem that needs to be addressed. The only people with the technical understanding to exercise this judgement are typically the members of the development team.

It could be that the tool is overzealous in pointing out a particular type of flaw. This is true, for example, of Brakeman, which essentially attempts to stamp out any instance of metaprogramming. Brakeman flags every instance of dynamic code generation using eval().

Using eval() is not evil. It’s part of the Ruby language for a reason. It is potentially problematic if the code being executed by eval() is derived from user input, since a nefarious user could try to compromise the system. But determining whether a particular use of eval() actually is a problem requires judgement, which in turn requires an understanding of the technical context of the code.
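To make this concrete, here is a minimal, hypothetical Ruby sketch (not from any real project) of two eval() calls that look identical to a scanner but carry very different levels of risk:

# Benign metaprogramming: the evaluated string is a developer-controlled
# constant, so there is no injection surface. A scanner like Brakeman
# would typically still flag it.
GREETING_CODE = '"Hello, " + name'

def greet(name)
  eval(GREETING_CODE)    # flagged, but harmless
end

# Genuinely dangerous: the evaluated string comes straight from user input,
# so a value like "system('cat /etc/passwd')" would run on the server.
def calculate(user_expression)
  eval(user_expression)  # flagged, and this one is a real problem
end

puts greet("World")      # => Hello, World

A tool can only report that eval() is present; deciding that greet is fine while calculate needs to be rewritten is exactly the judgement that has to stay with the development team.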

As a developer, I want to write the best code possible. I love static code analysis tools because they help me identify potential problems in my code. I’m capable of exercising the judgement to determine what does, and does not, need to be changed.

However, that determination is often not in my hands, and that’s when I start hating this type of tool. You see, management loves static code analysis tools.

Fundamentally, management doesn’t understand what software developers are doing, and that’s uncomfortable for them. Ultimately, management is responsible for what software development teams create, but typically doesn’t have a deep understanding of the end result.

Accordingly, management clutches at static code analysis tools the way a drowning man latches onto a life preserver. It’s a way for them to verify that good software is being produced. It’s the ultimate checkbox for them.

This is how using Fortify becomes a mandated best practice for government projects. Even for Ruby products, and even when the Fortify module for analyzing Ruby code is simply awful.

Even worse, once the use of static code analysis tools becomes mandated, other rules get added on. “You must submit a copy of your Fortify results with each software release.” And my favorite, “You must provide a detailed, written justification for each deficiency that is not corrected.”

This is where I start to hate static code analysis tools. Because now I have to waste valuable time writing a justification, for a non-technical person who generally doesn’t understand the issues, explaining why I didn’t fix a “deficiency” in the code. Management often doesn’t understand that “false positives” happen or that dynamic coding (in the case of Ruby) is useful.

This is why I have a love/hate relationship with tools like Fortify and Brakeman. I love them because they help me, and my teams, to write better, more secure code. But I hate being bludgeoned by their results when the capacity to properly evaluate their output is taken away by management.

In the Slideshare Top 5% for 2014

I’ve got twenty of my presentations up on Slideshare. My most popular presentation of the year was Social Networking: The Next Weapon Against Bad Actors, which is about how social networking technologies can be leveraged to assist in cyber security. I first presented this at the government’s GFIRST Cyber Security Conference.

My second most popular presentation was 21st Century Writer. This one discusses changes in the publishing industry brought on by the advent of electronic books.

Speaking at DevIgnition

My friend and fellow conference organizer, Gray Herter, just sent me a picture he took of me speaking at the DevIgnition Conference at Booz Allen Hamilton’s Newman Auditorium in McLean, VA. This is another of the 37-plus places where I’ve spoken, and it was a very, very nice venue. There were roughly 150 attendees at the event.

Interesting Email Chain

I recently participated in an interesting email exchange that I thought would be fun to share with the world. The exchange started with the following seemingly innocent request for help:

From: Frederick, Rob
To: Crow, Drew; Johnson, Quentin; Keener, David
Sent: Tuesday, November 18, 2014 9:42 AM
Subject: Task – IE Testing

I need to test the user interface for my new web site feature under IE 9 and higher. I need some help from the team in order to test this.

An almost instant response came back from another member of the software development team. Note that, except for my own name, the other names have been changed to protect the innocent.

From: Johnson, Quentin
To: Crow, Drew; Frederick, Rob; Keener, David
Sent: Tuesday, November 18, 2014 9:42 AM
Subject: RE: Task – IE Testing

This is directly on the Microsoft Website:

“Windows Internet Explorer 9 lets your websites shine and perform just like native applications on your PC.”

What more of a guarantee do you need?

A few minutes later, the first test results came back from another one of the developers.

From: Crow, Drew
To: Frederick, Rob; Johnson, Quentin (NE); Keener, David
Sent: Tuesday, November 18, 2014 9:51 AM
Subject: RE: Task – IE Testing

Line: 34294 Error: [$injector:modulerr] Failed to instantiate module ng due to: Object doesn’t support this property or method http://errors.angularjs.org/1.3.0-beta.17/$injector/modulerr?p0=ng&p1=Object%20doesn’t%20support%20this%20property%20or%20method

Is this what “shining” looks like?

I’ve never been one to resist a perfect setup, so I responded with a one-word killer response:

From: Keener, David
To: Crow, Drew; Frederick, Rob; Johnson, Quentin
Sent: Tuesday, November 18, 2014 9:52 AM
Subject: RE: Task – IE Testing

redrum

It’s always fun when I can combine my passion for software development with my love for movies (and books). It’s even more fun when there’s a delayed reaction as people one-by-one figure out my joke.

Workshop at Capclave

I’ll be conducting a 2-hour workshop entitled “Public Speaking for Writers” at Capclave 2014, on October 12th from 2:00 PM to 4:00 PM.

Hard-Working Attendees of David Keener's Workshop

Note that authors Tom Doyle and Donna Royston attended the workshop.