Cybernetic Entomology: Security Focus
Many of the Cybernetic Entomology exhibits I’m putting in this yet-to-be-published book are security bugs. So many, perhaps, that a lot of readers will think of it as a book about security rather than a book about software quality.
That may be true. But while most of the bugs in other areas seem to me to be at least understandable results of people making a best effort and not quite getting everything right, security flaws mostly have a special and horrifying quality, which I’ll sum up as “What the hell were they smoking?”
Security problems, more than any other kind, seem to arise from developers or companies not even trying to maintain security, or even deliberately subverting it. I find that offensive to the entire profession of software development. The billions or trillions of dollars lost annually worldwide to the crooks, scammers, and thieves who exploit the holes that buggy software and buggy hardware leave behind are our own steaming turd, a disgrace lying at the feet of our entire industry.
Computer security requires broader thinking than we are used to in the software development industry. In most applications we are not cooperating with other software vendors; we are simply trying to make sure that something we want to happen, happens. In software security, our job is to ensure that something our clients do not want to happen does not, and to do that we have to cooperate with other software vendors. Most people who write about software security dwell on the first of these differences: the distinction between trying to make something happen and the broader challenge of making sure something does not.
But I want to address the other side of this coin. In most applications, we are serving our own interests by providing functionality for our own programs, and if we can do something that security prevents our competitors from doing, that’s an advantage. In software security, we are serving our clients by cooperating to maintain barriers that deny some functionalities to all programs, including those that would abuse them. The distinction is professionalism: whether we work independently to serve our own interests, or cooperate with each other in the interests of the people who depend on us.
Refusing to cooperate in the interest of others and placing our own interests ahead of those of the people who depend on us is bad behavior. In fact, it is exactly like being a crooked investment counselor or a snake oil salesman. It is unprofessional. It is precisely what professional standards of behavior in Engineering, Medicine, Law, Accounting, and Investment Counseling are intended to prevent.
Considering that the people who depend on software are now as great in number as the clients of any of those professions (i.e., everybody), and arguably even greater in their exposure to harm from our failures, it is past time that we lived up to some professional standards of behavior ourselves. And that is why “not even trying” is simply unacceptable as an approach to software security.
Collectively, from CEOs to engineers to sales executives, we are the only industry that can fix software security. And whether any particular part of the problem is our fault or not, the solution is our responsibility. That’s what being a professional means.
So, yes, I’m focusing more sharply on security bugs than a completely general examination of bugs would. This is because, as I see it, computer insecurity is the greatest failure of our industry to behave professionally, and possibly the single most persistent and costly hazard of the modern world.