I'm studying for the computer science Ph.D. qualifying exam, so I've started going back through my algorithms book (Intro to Algorithms by Cormen, Leiserson, Rivest, and Stein).  The first chapter was, of course, about motivating the study of algorithms.  One exercise that made an impression on me has you generate a table giving the largest problem size you could solve in various amounts of time using algorithms of various asymptotic complexities.

Since I take every opportunity to make progress learning Haskell, I coded it up:
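Here's a minimal sketch of the idea, assuming a machine that performs 100 steps per second (the rate reflected by the n row of the table below), 30-day months, and 365-day years; the exact figures depend a little on details like the logarithm base and floating-point range, but the shape of the table comes out the same.  For each complexity function it finds the largest n by doubling past the time budget and then binary searching back:

```haskell
import Text.Printf (printf)

-- Time budgets expressed in steps, assuming the machine performs
-- 100 steps per second, with 30-day months and 365-day years.
budgets :: [(String, Double)]
budgets =
  [ ("1 sec.",    stepsPerSec)
  , ("1 min.",    stepsPerSec * 60)
  , ("1 hour",    stepsPerSec * 3600)
  , ("1 day",     stepsPerSec * 86400)
  , ("1 month",   stepsPerSec * 86400 * 30)
  , ("1 year",    stepsPerSec * 86400 * 365)
  , ("1 century", stepsPerSec * 86400 * 365 * 100)
  ]
  where
    stepsPerSec = 100

-- The complexity functions from the exercise.
-- Note: Haskell's `log` is the natural log; `logBase 2` is lg.
complexities :: [(String, Integer -> Double)]
complexities =
  [ ("lg(n)",   logBase 2 . fromIntegral)
  , ("sqrt(n)", sqrt . fromIntegral)
  , ("n",       fromIntegral)
  , ("n lg(n)", \n -> fromIntegral n * logBase 2 (fromIntegral n))
  , ("n^2",     \n -> fromIntegral n ^ (2 :: Int))
  , ("n^3",     \n -> fromIntegral n ^ (3 :: Int))
  , ("2^n",     \n -> 2 ** fromIntegral n)
  , ("n!",      \n -> product (map fromIntegral [1 .. n]))
  ]

-- Largest n with f(n) <= budget: double n until f(n) exceeds the
-- budget, then binary search back.  Assumes f is non-decreasing.
largestN :: (Integer -> Double) -> Double -> Integer
largestN f budget = search 1 (grow 1)
  where
    grow n
      | f n > budget = n
      | otherwise    = grow (2 * n)
    search lo hi
      | lo + 1 >= hi    = lo
      | f mid <= budget = search mid hi
      | otherwise       = search lo mid
      where
        mid = (lo + hi) `div` 2

-- Show small results exactly, large ones in scientific notation.
render :: Integer -> String
render n
  | n < 100000 = show n
  | otherwise  = printf "%.1e" (fromIntegral n :: Double)

main :: IO ()
main = do
  printf "%-8s" "f(n)"
  mapM_ (\(label, _) -> printf " %10s" label) budgets
  putStrLn ""
  mapM_ row complexities
  where
    row (name, f) = do
      printf "%-8s" name
      mapM_ (\(_, steps) -> printf " %10s" (render (largestN f steps))) budgets
      putStrLn ""
```

With Double arithmetic the lg(n) row tops out around 1.8e308, a 309-digit number, which is presumably where the "infinite" entries below come from.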

The table shows the largest value of n you can process given an algorithm of a certain complexity and a certain amount of time:

| f(n)    | 1 sec. | 1 min. | 1 hour | 1 day  | 1 month | 1 year | 1 century |
|---------|--------|--------|--------|--------|---------|--------|-----------|
| lg(n)   | 2.7e43 | ∞*     | ∞*     | ∞*     | ∞*      | ∞*     | ∞*        |
| sqrt(n) | 10000  | 3.6e7  | 1.3e11 | 7.5e13 | 6.7e16  | 9.7e18 | 9.7e22    |
| n       | 100    | 6000   | 3.6e5  | 8.6e6  | 2.6e8   | 3.1e9  | 3.1e11    |
| n lg(n) | 29     | 884    | 34458  | 6.5e5  | 1.6e7   | 1.6e8  | 1.3e10    |
| n^2     | 10     | 77     | 600    | 2939   | 16099   | 55770  | 5.6e5     |
| n^3     | 4      | 17     | 68     | 194    | 597     | 1357   | 6203      |
| 2^n     | 6      | 12     | 18     | 23     | 27      | 31     | 38        |
| n!      | 4      | 7      | 8      | 10     | 11      | 12     | 14        |

* These values aren't really infinite, but they are over 300 digits long!

So what's the lesson here?  Having an algorithm with good asymptotic complexity makes a huge difference in the amount of data it is feasible to process.  Just look at the difference between the linear (n) and logarithmic (lg n) rows in the one-second column: 41 orders of magnitude!