I'd like to share an idea that has been brewing in my head for a few months now. Ever since I read The Inmates Are Running the Asylum by Alan Cooper, I've started noticing something about how "computer literate" people behave with respect to new, presumably better software. You don't have to buy and read the book, since I will paraphrase the relevant parts for you. It might take a paragraph or two, though, so bear with me.

I'm going to start by defining some terminology that Alan uses in his book regarding two types of software users: survivors and apologists. The fundamental argument of Alan's book is that software systems are designed in a way that disregards human sensibilities, creating artificial difficulties in how humans use applications. There are two approaches that people take when using these difficult systems.

The first group consists of those who, in trying to work with strange software, learned the arbitrary rules that drive it and became complacent (if not comfortable) with the arbitrary nature of computing. They are called apologists because they had to learn about computers in order to understand the software they use, and because of that they appreciate how difficult it is to build friendly systems. They apologize, not unlike Stockholm syndrome sufferers, for the software creators' sloth and lack of consideration for human beings. Apologists will often tell themselves that they have gained a great deal of power by thinking slightly more like machines in order to operate computers successfully. Sufficiently advanced apologists even go on to become programmers themselves (like me), perpetuating overcomplicated software (hopefully not like me).

Survivors, on the other hand, do not understand why computers are so difficult to use and (rightly) feel they shouldn't have to learn complicated and arbitrary rules just to use them. They're called survivors because they manage to live on the scraps of easy-to-use functionality that sit on the surface of software applications. They do not like computers, because computers make life complicated for them.

Alan attacks the very idea of "computer literacy", arguing that the notion that some users "get it" (i.e., apologists) and some do not (survivors) is a symptom of a structural problem in the way developers create software: instead of writing software for people, they write software for the hardware. The arbitrary rules of the hardware propagate up to the user interface and therefore demand a significant understanding of computers before the software can be used correctly. That is the definition of computer literacy, and when you think of it that way it's completely artificial. If programmers spent more time designing software to make sense from a human interaction perspective, the term computer literacy would never have needed to exist.

Thankfully, nowadays, with the Web, smartphones, and other technology trying to reach as many users as possible, software companies are investing far more time and effort in proper interaction design: they're building natural interfaces whose purpose and functionality are obvious, showing users an appropriate amount of information instead of forcing them to keep things in their heads, and automating as much as possible.

This makes survivors incredibly happy since now they can get a lot more work done without feeling frustrated (or, let's be honest, without feeling as frustrated — we're not perfect yet). Since you don't need to have computer knowledge to use newer systems, software products are used by the non-computer-literate population (which is the majority) and sales and satisfaction increase. It's incredibly good for these companies and it's fantastic that the trend is finally swinging the right way.

But what about the apologists? They tend to complain when usability advances are made to the applications they use the most. Remember the Office ribbon? It's a feature Microsoft created after spending countless hours studying how menus and toolbars are used and how they could be improved. Nested menus are objectively bad interaction design, which is why Vista dropped them from its Start menu, and it's why the Office ribbon exists.

Yet if you ask most Office users what they think of the ribbon (at least if you ask the ones I know), they'll tell you that it annoys them. The same was true when Google removed the "http://" prefix from URLs in Chrome's address bar, when Google introduced threaded conversations to e-mail with Google Mail, and when Windows XP grouped its Control Panel items instead of just listing them. I could go on: when established products improve their ease of use, apologists tend to resist.

By definition, these people have invested a lot of time in learning how computer systems work in order to use them, and now the trend is toward software that can be used without that knowledge, thereby devaluing the knowledge they've accumulated. I'm sure this isn't the only factor, but to me it seems like a plausible reason for them to resist these changes.

Software is changing for the better: gone are the days when you needed to know a programming language to run applications, gone are the days when installing and using software required thick manuals. We live in the survivors' era, where the best software is designed from the ground up with user satisfaction and understanding in mind. In this world the tables have turned: survivors are the target audience, and apologists are the ones resisting and complaining.

It's taking some of us too long to unlearn computer literacy and I guess we can get pretty cranky. We have a thing or two to learn from you survivors, since you guys were right all along. On behalf of all of us apologists, I'm sorry.