The other day, in a presentation about tools to enhance the collaborative use of information and computing resources, an executive made the following statement (which I’m paraphrasing here):
Since 1990, the amount of computing power we can put on a worker’s desktop has increased by a factor of about 200. During the same period, the average worker’s productivity has increased by a factor of 1.2 to 1.4. Something’s wrong with this picture.

Assuming, for this discussion, that he’s right about those figures, let’s look at the situation a bit.
First, the productivity numbers there refer to gains “in GNP sorts of terms” — that is, they’re measuring the average worker’s productivity by that worker’s contribution to the bottom line, and by how much the bottom line has changed over that time. But we’re measuring the computers purely in terms of raw computing power, so it’s not a fair comparison: a hot, top-of-the-line PC today does not cost 200 times what a PC cost in 1990. In fact, the opposite is true: the cost of computers has dropped, while their power has gone up. At some level, we are now spending less than we were in 1990, and getting 20% to 40% more out of it.
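The point can be made with a back-of-the-envelope calculation. In this sketch, the 200× compute figure and the 1.2–1.4× productivity figure come from the quote above, but the dollar prices are purely hypothetical, made up only to illustrate why raw compute is the wrong denominator:

```python
# The 200x compute and 1.2-1.4x productivity figures are from the quote;
# the dollar prices below are hypothetical, chosen only to show the shape
# of the argument (computers got cheaper while getting more powerful).

compute_1990 = 1.0        # normalized desktop compute, 1990
compute_now = 200.0       # ~200x more, per the quote

price_1990 = 3000.0       # hypothetical 1990 desktop price, USD
price_now = 1000.0        # hypothetical current price, USD -- lower, not 200x higher

productivity_gain = 1.3   # midpoint of the quoted 1.2-1.4x range

# The implied comparison: productivity gain per unit of raw compute.
per_compute = productivity_gain / (compute_now / compute_1990)

# The fairer comparison: productivity gain per dollar actually spent.
per_dollar = productivity_gain / (price_now / price_1990)

print(f"per unit compute: {per_compute:.4f}")  # tiny: 0.0065
print(f"per dollar spent: {per_dollar:.1f}")   # healthy: 3.9
```

Measured against raw compute, productivity looks dismal; measured against what we actually paid, it looks respectable. That is all the "unfair comparison" point amounts to.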
OK, that said, I think there is something of interest in the observation. Two somethings, as I see it.
One is that our jobs have become increasingly complex, as we’ve been asked to juggle more high-tech stuff — more hardware, more software, more connectivity, more data, more access to more things. I wonder, sometimes, what the Flying Karamazov Brothers would do with all that. What used to be jobs involving filing, taking messages, dealing with phone calls, and the like have turned into jobs as information hunters and gatherers, filling our desks, our heads, and our time with the collective knowledge of humanity.
The other is how much time we spend waiting for all that to happen.
Twenty-five years ago, my colleague Walt Doherty, now retired, studied response time on computer systems and its effect on user productivity. Walt and his co-author, Arvind Thadani, looked at the bottom line, at where the costs trade off. Their conclusion:
The potential benefits for an organization in providing improved and ultimately subsecond response time for online computing include substantial cost savings, improved individual productivity, shortened project schedules, and a better quality of work. These benefits are inherent in the computing situation; they do not depend on the type of work being done, as will be demonstrated by the diversity of the environments in which they have been demonstrated.
Now, their work was done in the days of mainframe computers and limited networking. While the translation into today’s world of personal computers, web browsers, and the world wide web isn’t straightforward, their basic points must still be valid:
This phenomenon seems to be related to an individual’s attention span. The traditional model of a person thinking after each system response appears to be inaccurate. Instead, people seem to have a sequence of actions in mind, contained in a short-term mental memory buffer. Increases in SRT [system response time] seem to disrupt the thought processes, and this may result in having to rethink the sequence of actions to be continued.

That is: our productivity suffers when our attention is diverted from the task at hand; the damage done far exceeds the length of the diversion; and waiting for the computer to respond is just such a diversion.
We expended a great deal of effort tweaking and tuning the mainframe systems, and we purchased sufficient computing power, to get the response time down to where the diversion didn’t have a severely negative effect. If Walt’s study showed that ¼-second response time was ideal, well, we might have had to settle for ½ of a second, or ¾, but we did the best we could at balancing the cost of computing resources against the cost of interfering with the work we were trying to get done.
PCs changed everything. Initially, we no longer had to share our computers, so someone else’s work didn’t slow things down for us. On the other hand, as we developed operating systems, applications, and user interfaces, many things became slower and slower over time. To the extent that individual operations accomplish more now, that’s OK. But it’s often true that something we used to do by typing a command is now done by clicking a button or two, and the slowdown is noticeable.
But what really broke the slowness barrier was the very thing that put the world at our fingertips: the world wide web.
What started as a network of text pages and hyperlinks to other text pages, with a few photos and graphics mixed in, quickly shifted in character. Predictably, understandably, as more people put more stuff up there, and as commercial interests took over a good portion of things, content and layout became everything. It’s rare to find a plain text web page now (here’s one, at least at the time of this writing, presented here for your amusement (and it might further amuse you that I bookmarked this page at least twelve years ago, and it hasn’t changed since)). Most are full of backgrounds, animated graphics, Flash items, frames, style sheets, and so on. Some of that makes for very nice web pages, beauty to behold. But there’s no doubt that it makes for far longer page-load times, even as we’ve cranked our network speeds up ever faster. Who isn’t used to web pages that take many seconds to load? Subsecond response time is not an expectation of web surfers.
What’s more, we have to learn to navigate these web sites, some of which are quite complex, and some of which do not make it terribly easy to find what you’re looking for. By the time you’ve clicked around for a while, each time waiting for new pages to load, you might be ready to toss the whole thing into the bin... and still not have found what you set out to find.
And on top of all that, we’ve made simpler user interfaces that are paradoxically harder to use. It’s easier, we say, to “point and click” than it is to type some arcane command. That’s as may be, provided you know what to point at and when to click. When what you’re looking for is buried under several layers of menus, or amid an impossible array of web pages and sub-pages, it doesn’t amount to being very simple after all. And for those who think that the answer is to provide shortcuts for the savvy, remember that most people are not “savvy” in that regard. A vanishingly small set of users will use shortcuts; you don’t get a free pass out of application-designer hell for having put in a bunch of those.
In the end, however much we may think that “collaborative applications”, the interlinking of related information, and similar things will “improve productivity”, unless we get back to simple user interfaces, to application and data design that makes it easy to find what we’re looking for, and to fast access to the information once we find it, we will not improve much.