Here's a nifty little piece of useless Ruby code:

```ruby
require "quine"
```

and the result is:

```
require "quine"
```

(the output of a quine being, of course, its own source)

I recently started a couple of new projects (Rails 3 + mongoid) and I've noticed a pattern in the way I handle coded fields. My key example is something like the following: you have a field, say status, that can take several values, say active, closed, and invalid. Obviously you could store those as strings in the database, or you could code them as, say, 0, 1, and 2. Normal database practice is to code them as integers to save space, but a far more important concern is that clients change their minds about what to call things. It's much easier to change a string value in one place in the code than to go change every string in the database.
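Done by hand, the pattern looks something like this (a minimal sketch; the class and field names are just for illustration):

```ruby
# Store a compact integer code in the database; keep the human-readable
# labels in code, where a client's renaming is a one-line change.
class Account
  STATUSES = { 0 => "active", 1 => "closed", 2 => "invalid" }

  attr_accessor :status_code   # the integer that actually gets persisted

  def status
    STATUSES[status_code]      # decode for display
  end

  def status=(value)
    self.status_code = STATUSES.key(value)   # encode for storage
  end
end
```

So `account.status = "closed"` stores 1, and renaming "closed" to, say, "archived" touches only the STATUSES hash, never the rows already in the database.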

Anyway, the usage boils down to a single declaration per coded field (the README has the full example):
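Roughly, the one-line declaration stands in for all of the hand-written mapping code. Here's a self-contained sketch of such a class macro — the real coded_options signature and the names it generates may differ:

```ruby
# A stand-in for a coded_options-style class macro: one declaration
# generates the code/value mapping plus reader and writer methods.
module CodedOptions
  def coded_options(name, values)
    mapping = {}
    values.each_with_index { |value, code| mapping[code] = value }

    const_set("#{name.to_s.upcase}_OPTIONS", mapping)  # e.g. STATUS_OPTIONS
    attr_accessor "#{name}_code"                       # e.g. status_code

    define_method(name) { mapping[send("#{name}_code")] }
    define_method("#{name}=") do |value|
      send("#{name}_code=", mapping.key(value))
    end
  end
end

class Account
  extend CodedOptions
  coded_options :status, %w(active closed invalid)
end
```

With this sketch, `account.status = "invalid"` sets `account.status_code` to 2, and `account.status` reads back "invalid".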

Here the single coded_options call basically gets mapped into a dozen or so lines of generated constants and methods. Nothing spectacular, but it's really cleaned up my code quite a bit, so maybe it will be useful to some other folks. The code is up on GitHub (http://github.com/jasondew/coded_options) and the gem is on Gemcutter (http://rubygems.org/gems/coded_options).

I'm studying for the computer science Ph.D. qualifying exam, so I've started going back through my algorithms book (Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein). The first chapter was, of course, about motivating the study of algorithms. One exercise that made an impression on me had you generate a table giving the largest problem size you could solve in different amounts of time with algorithms of different asymptotic complexities.

Since I take every opportunity to make progress learning Haskell, I coded it up:
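The computation is straightforward: for each complexity function, find the largest n whose cost fits in the step budget. A sketch along those lines — assuming a budget of 100 steps per second, which reproduces the table's linear row — might look like:

```haskell
-- For a cost function f and a step budget, find the largest n with
-- f n <= budget: double n until it overshoots, then binary search
-- the remaining interval.  Integer avoids overflow for 2^n.
largestN :: (Integer -> Integer) -> Integer -> Integer
largestN f budget
  | f 1 > budget = 0
  | otherwise    = grow 1
  where
    grow n
      | f (2 * n) <= budget = grow (2 * n)
      | otherwise           = shrink n (2 * n)
    shrink lo hi                      -- invariant: f lo <= budget < f hi
      | hi - lo <= 1    = lo
      | f mid <= budget = shrink mid hi
      | otherwise       = shrink lo mid
      where mid = (lo + hi) `div` 2

main :: IO ()
main = mapM_ putStrLn
  [ row "n"   id
  , row "n^2" (\n -> n * n)
  , row "2^n" (2 ^)
  ]
  where
    budgets = [100, 6000]             -- steps in 1 second and 1 minute
    row name f = name ++ ": " ++ show (map (largestN f) budgets)
```

With those budgets the n, n^2, and 2^n rows come out as 100/6000, 10/77, and 6/12, matching the first two columns of the table below; the lg(n) and n! rows need the special handling (inverting the cost function, huge intermediate values) that this sketch skips.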

What the table shows is the largest value of n you could process with an algorithm of a given complexity in a given amount of time:

f(n) | 1 sec. | 1 min. | 1 hour | 1 day | 1 month | 1 year | 1 century |
---|---|---|---|---|---|---|---|
lg(n) | 2.7e43 | ∞* | ∞* | ∞* | ∞* | ∞* | ∞* |
sqrt(n) | 10000 | 3.6e8 | 1.3e11 | 7.5e13 | 6.7e16 | 9.7e18 | 9.7e22 |
n | 100 | 6000 | 3.6e5 | 8.6e6 | 2.6e8 | 3.1e9 | 3.1e11 |
n lg(n) | 29 | 884 | 34458 | 6.5e5 | 1.6e7 | 1.6e8 | 1.3e10 |
n^2 | 10 | 77 | 600 | 2939 | 16099 | 55770 | 5.6e5 |
n^3 | 4 | 17 | 68 | 194 | 597 | 1357 | 6203 |
2^n | 6 | 12 | 18 | 23 | 27 | 31 | 38 |
n! | 4 | 7 | 8 | 10 | 11 | 12 | 14 |

* The values here aren't really infinity, *but* they do run to over 300 digits!

So the lesson here? Having an algorithm with a good asymptotic complexity makes a huge difference in the amount of data it is feasible to process. Just look at the difference between linear complexity (n) and logarithmic complexity (lg n): 41 orders of magnitude!
