This article is in response to the SANS Institute's Top 25 Most Dangerous Programming Errors. This list covers a number of fairly common, mostly security-related errors that have been found in most modern applications.
I can't comment much on the list itself, as I haven't yet had the chance to review the problems it covers in detail. At the moment, I'm more interested in Mr. Wolfe's article, which I find to be way off the mark in its observations about modern software development:
You old-timers -- programmers who worked in the business before the PC industry kicked the waterfall development model to the curb -- know what I'm talking about. That waterfall model was replaced by a process (and I use that word loosely) where the modus operandi was to cram in as many features as possible before the shipping cut-off date, and then fix the problems in beta. (Sure, I know time pressures mean waterfall wasn't rigidly adhered to, and also that it had deficiencies, leading to the 1980s flowering of alternatives like agile-development and object-oriented programming. But at least we had a model.)

Leaving aside his mention of OO programming -- which is a programming paradigm created to help manage software complexity, not a software development methodology in and of itself -- he seems to think that waterfall was the ideal way to develop software and a sure-fire way to avoid the mistakes enumerated by SANS.
Even more disturbing is that he later laments that more developers should read Fred Brooks' The Mythical Man-Month, which leads me to believe he didn't read the book very thoroughly himself. If anything, that book works against the very argument he's trying to make here.
Brooks' entire premise in The Mythical Man-Month was managing complexity: as software systems grow, they become harder to design and build. He was exploring methods and organizational approaches to deal with that complexity in an efficient and cost-effective manner, because his observations from working on IBM's OS/360 had shown him that the software development methods of the day just weren't cutting it.
Remember, this book was written thirty years ago.
And what was the most common software development methodology back then? Waterfall.
Granted, Brooks focused more on the organizational side of the equation -- how to organize teams of developers to manage communication and help with systems integration -- but he recognized that the approaches of his day just weren't working as well as was needed...and he saw this over three decades ago.
Now here we are in modern times, and software systems haven't gotten any less complex than they were back then -- in fact, they've gotten much more complex as technology has advanced. And the pace at which technology advances just multiplies the difficulties modern developers face. Anyone who's tried to take the Big Design Up Front (BDUF) approach (as I have) has seen how horribly bad it can be in practice in this day and age. Now, I'm not going to sit here and say that Agile, or any other methodology, is the absolute answer to this problem. But at this stage in the game, most software developers have recognized that waterfall is woefully inadequate for modern software development.
But even with the "right" methodology in place, that won't stop the crappy software that Mr. Wolfe is complaining about. Because it's not just the process, but the execution of that process that needs to be addressed.
And this is where I think Mr. Wolfe completely misses the mark on where the problems of crappy software really lie. As a working software developer (which puts me in a much better position to make these observations), I can say without a doubt that most of the problems he has with current software systems lie not with software developers, but with management.
Just about every software developer I've known is smart, driven and, above all, wants to do a good job. They all want to write great software.
The problem is that, in far too many cases, they're not allowed to.
When a developer insists on doing more testing (or refactoring, or redesigning) of some batch of code, only to have his manager tell him he's wasting valuable time, exactly who is at fault for this?
When managers don't care about anything but the bottom line, or being first to market, or say "just get it done", is that the developer's fault?
When managers insist on focusing developer efforts on less vital features, or tell them to ignore stuff like security, or usability, is that the developer's fault?
When managers concern themselves only with lines of code or hours worked rather than features completed, or play political games with their project to make themselves look good (or their developers look bad) rather than ensure that a piece of software actually works right, is that the developer's fault?
According to Mr. Wolfe it is.
No wonder so many of us are getting burned out.