Recently I had the opportunity to meet someone online who hates programmers.

Seriously, he hates programmers. I had trouble understanding why at first. It seemed to me like he was taking the common "Microsoft sucks" attitude one step farther than normal. He extended all of the little issues he has with software to programmers in general. This isn't completely unjustified. Programmers are the root cause of his issues, after all.

After I spent a few minutes arguing with him, trying to understand his point of view, he mentioned in passing that Vista was an example of poor programming because of how much space it takes up on his hard drive. His reasoning was simply that smaller programs are faster; to him, this was obvious. As he said it, I realized that this argument, though completely incorrect, seems obvious to people who don't have a background in computer science.

Not only is the idea that smaller programs are faster incorrect, but a lot of the time the exact opposite is true: the more space a program takes up, the more responsive it is. To be fair, the smaller-is-faster intuition isn't always wrong; sometimes smaller programs really are faster than their larger counterparts. But most of the time it leads you astray. Let's look at some examples.

First, let's look at file compression. Most people are familiar with ZIP files: you can add many files to a ZIP archive and the result will be smaller than the total size of the originals. The archive takes up less space on the hard disk than the original files did. But it takes time to put those files into the archive; the algorithm trades time for space. Anyone who has had to compress very large files knows that making a ZIP file can take tens of minutes. Intuitively, you are spending more time to get a smaller result.
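To put a rough number on that trade, here's a quick sketch in Python using the standard zlib module (the library behind ZIP's DEFLATE compression); the sample data is made up for illustration. The higher compression levels produce a smaller result but take noticeably longer:

```python
import time
import zlib

# Repetitive sample data (about 11 MB) so the effect is easy to measure.
data = b"the quick brown fox jumps over the lazy dog " * 250_000

# zlib level 1 is "fast but big", level 9 is "small but slow".
for level in (1, 6, 9):
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(compressed):>8} bytes in {elapsed:.3f}s")
```

Level 9 hunts harder for repeated patterns than level 1 does, so it burns more CPU time to shave off more bytes. That's the same trade your ZIP tool makes; here you're just dialling it by hand.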

The opposite trade is just as common. Most people are familiar with search, and whether it's web search or desktop search, the engines behind it take up huge amounts of space in order to return results quickly. Remember back in the Windows 9x days when searching for a file on your PC took tens of minutes? That's because the search went to every individual file, looked at it, and decided whether it matched your criteria. In Vista, your search results come back in moments. That's because Vista keeps what's called an index of all your files. An index is like a big dictionary: every word you could type into the search bar appears in it, and next to each word is a list of the files that contain it. When I say big dictionary, I mean huge. All of the information in the index is technically already in the files themselves, but Vista stores that same information again in a more searchable form so that you can get your results faster.
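Here's a toy version of that idea in Python, with a made-up set of three "files"; it doesn't reflect how Vista's indexer actually works, but it shows the shape of the trade: the index duplicates words that are already in the files, and in exchange a search becomes a single dictionary lookup instead of a scan of every file.

```python
from collections import defaultdict

# A pretend file system: file name -> contents (all made up).
files = {
    "shopping.txt": "milk eggs bread",
    "todo.txt": "buy milk and call the bank",
    "notes.txt": "the bank opens at nine",
}

# The slow way: read and scan every file, for every single search.
def scan_search(word):
    return sorted(name for name, text in files.items() if word in text.split())

# The fast way: spend extra space on an index, built once up front.
index = defaultdict(set)
for name, text in files.items():
    for word in text.split():
        index[word].add(name)

def indexed_search(word):
    return sorted(index[word])  # one lookup, no files touched

print(scan_search("milk"))     # ['shopping.txt', 'todo.txt']
print(indexed_search("milk"))  # ['shopping.txt', 'todo.txt'] -- same answer, no scanning
```

With three tiny files the difference is invisible, but with a few hundred thousand of them the scan takes minutes while the lookup stays instant.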

So there are two common situations in which bigger equates to faster. If you dive into more sophisticated algorithms, you'll find this tradeoff all over the place. Every computer scientist knows it, because it's one of the most rudimentary concepts in the field. We learn it right at the outset of programming and it stays with us forever. So we make software that takes up more and more space so that it can be more and more responsive to the user. But users see that extra space as wasteful, and they hate us for it.

So what can we do about it? I think all we can really do is try to make the relationship between time and space more obvious. One way would be to come up with an example that everyone can understand, one that clearly shows how storing extra information can lead to faster algorithms, and refer to it every time the subject comes up. As a mathematician, I think division would serve as an adequate example:

Say you have two numbers that you want to divide, like 25 and 7. You want the result today, and every Saturday from now on. So you calculate 3.571428 by hand the first time, and it takes you three minutes. Now you have a choice. You know you're going to need this result again next week. Do you write it down now (which will take up space on your page), or do you trust yourself to spend three minutes recalculating it next week (which will cost time then, but save space now)? That's the time/space tradeoff.
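In code this choice usually goes by the name caching, or memoization. Here's a minimal sketch in Python, where a deliberately slow division function stands in for the three minutes of hand arithmetic:

```python
import time

cache = {}  # the "written down" answers: costs space, saves time later

def slow_divide(a, b):
    time.sleep(1)  # stand-in for three minutes of long division by hand
    return a / b

def cached_divide(a, b):
    if (a, b) not in cache:
        cache[(a, b)] = slow_divide(a, b)  # pay the time cost exactly once
    return cache[(a, b)]

cached_divide(25, 7)  # slow the first Saturday
cached_divide(25, 7)  # instant every Saturday after that
```

Python even ships this pattern in its standard library as functools.lru_cache, precisely because the trade comes up so often.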

This example is helpful because it demonstrates that the value has to be calculated at least once. That first-time calculation is the reason modern applications take longer to start up than their older counterparts, even on faster computers: they're precalculating all kinds of nice things for you to make the rest of the experience faster. A little slowdown at the beginning saves you a lot of time overall.
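As a sketch of that pattern (again in Python, with a table of primes standing in for whatever a real application precomputes), all of the work happens once in the constructor, and every query afterwards is a single table lookup:

```python
class PrimeChecker:
    """Answers "is n prime?" instantly, after paying a startup cost."""

    def __init__(self, limit=1_000_000):
        # Startup cost: a sieve of Eratosthenes fills a table with one
        # entry per number -- extra memory in exchange for instant answers.
        self._table = [False, False] + [True] * (limit - 1)
        for n in range(2, int(limit ** 0.5) + 1):
            if self._table[n]:
                for multiple in range(n * n, limit + 1, n):
                    self._table[multiple] = False

    def is_prime(self, n):
        return self._table[n]  # no arithmetic left to do at query time

checker = PrimeChecker()          # the "slow startup": builds the whole table
print(checker.is_prime(999_983))  # True, answered straight from the table
```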

But the example-based approach is reactive: you can only give someone this example once they've already got the idea that large implies slow. It would be better to make the relationship obvious before they ever get the wrong idea. I don't have a solution to that one.

What do you think? If we don't make users aware that applications can take up more space in order to provide a faster experience, we as programmers are going to be hated by our customers for giving them what they want: fast, feature-rich software. How can we do it?