There is an interesting contradiction between the performance of modern hardware and the speed of modern software. My first PC had a 700 MHz CPU, 64 MB of RAM and Windows 98. It took 2-3 minutes to start the operating system. The laptop I'm writing this post on has a dual-core 2 GHz CPU, 2 GB of RAM and Windows 7. And it takes the same 2-3 minutes to start up.
Another example. I use Opera. It suits me well except for one thing - its launch time. I can't imagine what a browser has to do to take about 20 seconds to start. But it does. Chrome is supposed to be the fastest browser, but it also doesn't show its window immediately after launch. Hey guys, it's just a browser! It should start as fast as Windows Notepad does! It should simply open its window, which usually doesn't have a complex GUI at startup. Maybe it could additionally load a handful of favourite links, but that's all. It just shouldn't take so much time...
So the question is: why doesn't software become faster despite such significant growth in hardware capabilities over the last 10 years?
I think there are two reasons for this situation:
1. More powerful hardware imposes weaker performance requirements on programs.
2. Software development usually has to fit a schedule and a budget.
So you won't use char instead of int to store tiny numbers (as long as memory isn't a bottleneck, of course). You won't cache linear search results, because you have just a hundred items in the list. You will use Python instead of C because it's much easier to write programs in it.
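To make the linear-search example concrete, here is a minimal Python sketch of the trade-off: a plain O(n) scan versus a precomputed index. All names here are illustrative, not taken from any real codebase, and the point is exactly that at a hundred items the "lazy" version is perfectly satisfactory.

```python
# A list of ~100 items, as in the example above.
items = [f"item{i}" for i in range(100)]

def linear_search(needle):
    # O(n) scan - the "good enough" version nobody bothers to optimize,
    # because n is only 100 and the cost is invisible in practice.
    for index, value in enumerate(items):
        if value == needle:
            return index
    return -1

# The "more efficient" version: an index built once, then O(1) per lookup.
# Extra code, extra state to keep in sync with the list - extra dev time.
index_cache = {value: i for i, value in enumerate(items)}

def cached_search(needle):
    return index_cache.get(needle, -1)
```

Both return the same answers; only a profiler showing this search on a hot path would justify maintaining the cached version.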
So why do programmers do that? Because it takes less time and doesn't noticeably hurt the result. Creating a more efficient algorithm takes more time and costs more money. But nobody needs the best possible efficiency - satisfactory efficiency is enough. Browsers and operating systems take a long time to start because people launch them once a day and never shut them down. If you deviate from this common pattern... nobody cares :)