Thursday, July 15, 2010

Useless (but nifty) Ruby code

Here's a nifty little piece of useless Ruby code:

require "quine"
this.that.and.something_else

and the result is:
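However the original snippet pulls that off, one way to make a chain like this.that.and.something_else both parse and run is a catch-all method_missing that always returns its receiver. Here's a rough sketch of the idea (the Chainer class and its output are made up for illustration, not the snippet above):

class Chainer
  def method_missing(name, *args)
    puts "called #{name}"  # print each link so you can watch the chain go by
    self                   # returning self is what lets the next call keep chaining
  end
end

def this
  Chainer.new
end

# `and` is normally a keyword, but after a dot Ruby happily treats it as a method name
this.that.and.something_else   # prints "called that", "called and", "called something_else"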

Monday, July 12, 2010

coded_options: A new Ruby gem for coded fields

I recently started a couple of new projects (Rails 3 + mongoid) and I've noticed a pattern in the way I handle coded fields.  My key example is something like the following: you have a field, say status, that can take several values, say active, closed, and invalid.  Obviously you could store those as strings in the database, or you can code them as, say, 0, 1, and 2.  Normal database practice is to code them as integers to save space, but a far more important concern is that clients change their minds about what to call things.  It's much easier to change the string values in one place in the code than it is to go change every string in the database.
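For instance, the labels can live in one hash in the code, keyed by the integers that actually get stored (the field and value names here are made up):

# The integer codes are what get persisted; the strings only exist in code.
STATUS_DESCRIPTIONS = { 0 => "active", 1 => "closed", 2 => "invalid" }

STATUS_DESCRIPTIONS[2]  # => "invalid"

# If the client later decides "invalid" should read "rejected", this hash is
# the only thing that changes; every document in the database stays as it is.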

Anyway, the usage is something like this (yanked directly from the README):

Here line 4 of that snippet (the coded_options call) basically gets mapped into lines 6 through 14.  Nothing spectacular, but it's really cleaned up my code quite a bit, so maybe it will be useful to some other folks.  The code is up on GitHub (http://github.com/jasondew/coded_options) and the gem is on Gemcutter (http://rubygems.org/gems/coded_options).
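For a sense of what that saves, here's the kind of boilerplate a single declaration replaces, written out by hand. The class, field, and value names, and the coded_options call shown in the comment, are my own illustration rather than the gem's README:

# With the gem, roughly:  coded_options :status, %w(active closed invalid)
# (exact signature per the README).  Without it, you end up hand-writing
# something like this for every coded field:
class Subscription
  STATUSES = { 0 => "active", 1 => "closed", 2 => "invalid" }

  attr_accessor :status_id   # the integer code is what gets persisted

  def status                 # human-readable form of the stored code
    STATUSES[status_id]
  end

  def status=(description)   # accept the string form and store the code
    self.status_id = STATUSES.key(description)
  end
end

subscription = Subscription.new
subscription.status = "closed"
subscription.status_id   # => 1
subscription.status      # => "closed"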

Friday, July 9, 2010

The importance of a good algorithm

I'm studying for the computer science Ph.D. qualifying exam and so I've started going back through my algorithms book (Intro to Algorithms by Cormen, Leiserson, Rivest, and Stein).  The first chapter was, of course, about motivating the study of algorithms.  One exercise that made an impression on me was the one that had you generate a table giving the largest problem you could solve in different amounts of time with algorithms of different asymptotic complexities.

Since I take every opportunity to make progress learning Haskell, I coded it up:
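A minimal sketch of that sort of program (not the original code): it assumes each basic operation takes 10 ms, i.e. 100 operations per second, along with a 30-day month and a 360-day year, which is roughly what the numbers in the table below work out to, and it inverts each f(n) to find the largest n that fits in the given number of operations. Rounding details may differ from the original.

import Data.List (intercalate)
import Numeric   (showEFloat)

-- Assumed cost model: 100 basic operations per second.
opsPerSecond :: Double
opsPerSecond = 100

-- Column headings and their lengths in seconds (30-day month, 360-day year).
budgets :: [(String, Double)]
budgets =
  [ ("1 sec.", 1), ("1 min.", 60), ("1 hour", 3600), ("1 day", 86400)
  , ("1 month", 30 * 86400), ("1 year", 360 * 86400)
  , ("1 century", 100 * 360 * 86400) ]

-- Each row pairs a label with the inverse of f: the largest n with f(n) <= t.
rows :: [(String, Double -> Double)]
rows =
  [ ("lg(n)",   exp)              -- natural log: e^100 is the 2.7e43 in the table
  , ("sqrt(n)", \t -> t * t)
  , ("n",       id)
  , ("n lg(n)", invert (\n -> n * log n))
  , ("n^2",     sqrt)
  , ("n^3",     (** (1 / 3)))
  , ("2^n",     logBase 2)
  , ("n!",      \t -> fromIntegral (length (takeWhile (<= t) factorials))) ]
  where factorials = scanl1 (*) [1 ..] :: [Double]   -- 1!, 2!, 3!, ...

-- Largest x with g(x) <= t for an increasing g, by doubling then bisecting.
invert :: (Double -> Double) -> Double -> Double
invert g t = go 1 (grow 2)
  where
    grow hi | g hi > t  = hi
            | otherwise = grow (hi * 2)
    go lo hi
      | hi - lo < 1 = lo
      | g mid <= t  = go mid hi
      | otherwise   = go lo mid
      where mid = (lo + hi) / 2

-- Small numbers print in full, big ones in scientific notation, overflow as "inf".
cell :: Double -> String
cell x
  | isInfinite x = "inf"
  | x < 1e6      = show (floor x)
  | otherwise    = showEFloat (Just 1) x ""

main :: IO ()
main = do
  putStrLn (intercalate "\t" ("f(n)" : map fst budgets))
  mapM_ printRow rows
  where
    printRow (label, largestN) =
      putStrLn . intercalate "\t" $
        label : [ cell (largestN (seconds * opsPerSecond)) | (_, seconds) <- budgets ]

Past the first column the lg(n) entries overflow a Double and print as "inf"; those are the asterisked spots in the table.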

What the table shows is the largest value of n you could process given an algorithm of certain complexity and a certain amount of time:

f(n)       1 sec.    1 min.    1 hour    1 day     1 month   1 year    1 century
lg(n)      2.7e43    ∞*        ∞*        ∞*        ∞*        ∞*        ∞*
sqrt(n)    10000     3.6e7     1.3e11    7.5e13    6.7e16    9.7e18    9.7e22
n          100       6000      3.6e5     8.6e6     2.6e8     3.1e9     3.1e11
n lg(n)    29        884       34458     6.5e5     1.6e7     1.6e8     1.3e10
n^2        10        77        600       2939      16099     55770     5.6e5
n^3        4         17        68        194       597       1357      6203
2^n        6         12        18        23        27        31        38
n!         4         7         8         10        11        12        14

* The values here aren't really infinity but they are over 300 digits!

So the lesson here?  Having an algorithm with a good asymptotic complexity makes a huge difference in the amount of data it is feasible to process.  Just look at the difference between linear complexity (n) and logarithmic complexity (lg n) in the one-second column: 2.7e43 versus 100, a gap of 41 orders of magnitude!