Wednesday, September 19, 2007


Aspects of computer security

We often talk about “security” with respect to computers and computer networks as though it were a clearly defined, monolithic concept. It’s really not; there are a number of aspects to it, and sometimes, in differing contexts, we mean one aspect or another, or some varying combination. I thought I’d do what will probably end up as a series of posts about aspects of computer security.

I loosely split the general topic into this list of sub-topics (with links to the subsequent posts in the series):

  1. Authentication — Who am I, and how can I prove it?
  2. Authorization — What am I allowed to do?
  3. Access control — What data am I allowed to get to, change, create, delete?
  4. Privacy — Are communications and data safe from unauthorized viewing?
  5. Integrity — Are communications and data safe from unauthorized modification?
  6. Non-repudiability — Is there an undeniable connection between users and the data they created?

These aren’t absolute — one can certainly come up with a different set, or choose to add things to the list or remove some — and there’s overlap (say, between authorization and access control, or between privacy and integrity). But I think it’s a good list to work from.

I want to point out specifically that “encryption” is not “security”. Encryption is a tool that can be used to establish one or more aspects of security. In fact, we generally use encryption in authentication processes, to establish non-repudiability (digital signatures rely on it), and to ensure privacy and integrity.
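As a concrete illustration of encryption-related tools serving an aspect of security rather than being “security” itself, here is a minimal sketch using Python’s standard `hmac` module to provide integrity checking. The shared key and the messages are made-up examples; a real system would provision the key securely and would likely use full digital signatures for non-repudiability.

```python
import hmac
import hashlib

# Hypothetical shared secret; in practice it would be provisioned securely.
SECRET_KEY = b"example-shared-secret"

def sign(message: bytes) -> str:
    """Compute an HMAC-SHA256 tag for the message (an integrity check)."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(message), tag)

tag = sign(b"transfer $100 to Alice")
print(verify(b"transfer $100 to Alice", tag))   # True: message unmodified
print(verify(b"transfer $900 to Alice", tag))   # False: message was altered
```

Note that this only addresses integrity (and, because the key is shared, a weak form of authentication); it says nothing about privacy, since the message itself is not hidden.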

In this series I’ll also talk about some of the “threats” that computer security people try to defend against. A threat that I’ve been thinking of today comes to me from this NY Times editorial, about investigation into overloading telephone lines for a political purpose:

The Bush administration has spent a lot of time talking about mythical cases of voter fraud and election improprieties, but the New Hampshire phone jamming case was the real thing. Republican operatives hired an Idaho telemarketing firm to jam the lines to prevent people who needed help in voting from getting through. The scheme was a direct attack on American democracy.

The scheme was also what we call a “denial of service attack”, or a “DoS attack” for short. In a DoS attack, the idea is for the attacker to demand so much service that there’s little or no opportunity for legitimate users to get any. The one described in the editorial isn’t computer-related, but DoS attacks on web sites are very common; they’re a popular way for a group to try to block a web site it doesn’t like.

We sometimes specify a distributed denial of service attack (DDoS) — think of the difference between one phone calling repeatedly with the redial button and thousands of people each calling (distributed) — but essentially every Internet DoS attack these days is distributed, so the distinction is mostly unimportant.

Defense against denial of service attacks can be difficult, because it’s often hard to determine which service requests are the legitimate ones. Rate-limiting and block-listing are probably the most common mechanisms. Certain Internet addresses (IP addresses) are known to be bad, and are blocked outright — all contact from them is discarded. Other addresses are allowed to make requests, but if they make too many requests in too short a time, they, too, are blocked (usually for some period of time, though repeat offenders might be put on a permanent block-list).
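The combination of block-listing and rate-limiting described above can be sketched roughly as follows. The addresses, request limit, and time window here are all hypothetical example values, and a production defense would be far more elaborate (per-network limits, temporary rather than permanent blocks, and so on).

```python
from collections import defaultdict, deque

BLOCKLIST = {"203.0.113.9"}     # addresses known to be bad (example value)
MAX_REQUESTS = 5                # allowed requests per window (example value)
WINDOW_SECONDS = 10.0           # length of the sliding window

recent = defaultdict(deque)     # ip -> timestamps of its recent requests

def allow_request(ip: str, now: float) -> bool:
    """Return True if a request from this address should be served."""
    if ip in BLOCKLIST:
        return False            # known-bad address: discard all contact
    times = recent[ip]
    # Drop timestamps that have aged out of the window.
    while times and now - times[0] > WINDOW_SECONDS:
        times.popleft()
    if len(times) >= MAX_REQUESTS:
        BLOCKLIST.add(ip)       # too many, too fast: block the offender
        return False
    times.append(now)
    return True
```

The hard part, as noted above, is that this mechanism can’t tell a flood of attack traffic from a legitimate surge of interest; it only measures volume.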

Spam can be thought of as a DoS attack: if your inbox fills with enough junk, it might be impossible to find the real mail. Worse, spam filters, designed to defend your inbox, might mis-classify some mail as spam and delete it. Spam isn’t generally meant to have that effect, which shows that something can become an unintentional denial of service attack.

Another example of an unintentional DoS attack (outside the realm of computer networks) was the carrying of small scissors onto airplanes, when they were not allowed. The TSA eventually allowed them, saying that their screeners were spending so much time confiscating scissors that it affected the time they had to screen for more important things.
