Back in 1979, just out of grad school, I got my first real job as a software developer (we used to be called “programmers” in the olden days). I learned the C programming language then and did my software development on a Digital Equipment Corporation PDP-11/70 minicomputer running 6th Edition UNIX. And for the first time, I was using a multiuser computer that could be crashed by a non-privileged user doing non-privileged things. I mean, an error in Charlie’s or Mike’s (never mine!) code could crash the whole machine. I wondered: “Why doesn’t the operating system protect against such things?”
As I said, that was 1979. Thirty years later, in September 2009, my friend David Strom, citing a report from the SANS Institute on top cyber security risks, wrote, “Unpatched applications are the real threat.” Are they? I found myself repeating the question I had asked 30 years earlier: “Why doesn’t the operating system protect against such things?”
Quoting the SANS report, he wrote, “Most web site owners fail to scan effectively for the common flaws,” and “TippingPoint reported that vulnerabilities that were more than two years old were still waiting for patches.”
Patching? The SANS report is a good reminder of what can and should be done in the short term. But it is clear that, while patching is useful (I won’t even write “important”), it should matter only for data integrity or program availability, not for the security of the data or of the system. The hardware and software system should protect against such things. The computer science world has been flirting with “trusted computing” and trusted operating systems for years. The “real threat”? Operating systems we still cannot trust to effectively control and contain user-level applications.
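To make the point concrete, here is a minimal sketch in C, using only the standard POSIX setrlimit interface, of the kind of kernel-enforced containment I mean: a process is capped in CPU time and memory by the operating system, so a runaway bug is stopped by the kernel instead of taking the machine down with it. The specific limits are illustrative, not a recommendation.

#include <stdio.h>
#include <stdlib.h>
#include <sys/resource.h>

int main(void)
{
    /* Ask the kernel to cap this process at 2 seconds of CPU time and
     * 64 MB of address space (illustrative numbers only). */
    struct rlimit cpu = { 2, 2 };
    struct rlimit mem = { 64UL << 20, 64UL << 20 };

    if (setrlimit(RLIMIT_CPU, &cpu) != 0 || setrlimit(RLIMIT_AS, &mem) != 0) {
        perror("setrlimit");
        return EXIT_FAILURE;
    }

    /* A deliberate runaway loop: when the CPU limit is hit, the kernel
     * terminates this process (SIGXCPU, then SIGKILL); the rest of the
     * system keeps running. */
    for (;;)
        ;
}

Resource limits of this sort have been in UNIX since the early BSD releases; the point is simply that the enforcement lives in the kernel, not in the application being trusted to behave.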
12/23/09
1 comment:
In any case, "Unpatched applications" may be a vulnerability but they are not a "threat."