Some Geek Love
One of the geekier conversations I used to get drawn into (up there with arguing about favorite Serenity episodes, lamenting the demise of Omni magazine, and coming to blows over D&D rules interpretations) was over the relative merits of various sorting algorithms. Flowing Data has links to some interesting visualization approaches to sorting algorithms. This one is for quicksort (colors start out random on the left and must be sorted into Roy G Biv order).
In college I did a project on solving traveling-salesman-type problems with an algorithm called simulated annealing. Many approaches to the traveling salesman problem pick a random path, randomly swap pairs of stops on the path, and then stick with whichever alternative gives the better score (in this case the shorter path). Rinse, repeat a zillion times. The problem is that this approach can get stuck in, or converge on, a local optimum that is not as good as the single global optimum for the problem.
In simulated annealing, the algorithm is allowed to sometimes jump to a worse (i.e., longer) path, which lets it escape local minima. The size of the backwards jump that is allowed is slowly reduced as the algorithm runs. It is called simulated annealing because this is very similar to the annealing process in metals, where the metal is heated and then cooled slowly: the heat initially allows the molecules to jump to higher energy states, so that as the temperature drops the whole can settle into a more homogeneous, lower-energy structure.
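For the curious, here is a minimal sketch of the idea in Python. It is illustrative only: the cooling schedule, parameters, and helper names are made up for this example, not the code from my college project.

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour, given a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def anneal_tsp(dist, steps=100_000, t_start=10.0, t_end=0.001):
    """Simulated annealing for the TSP: random pair swaps, where worse
    (longer) tours are sometimes kept, with a probability that shrinks
    as the 'temperature' cools."""
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    cost = tour_length(tour, dist)
    best, best_cost = tour[:], cost
    for step in range(steps):
        # Exponential cooling from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        # Propose a move: swap two random stops on the path.
        i, j = random.sample(range(n), 2)
        tour[i], tour[j] = tour[j], tour[i]
        delta = tour_length(tour, dist) - cost
        # Always keep improvements; keep a worse path with
        # probability exp(-delta / t), which drops as t cools.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            cost += delta
            if cost < best_cost:
                best, best_cost = tour[:], cost
        else:
            tour[i], tour[j] = tour[j], tour[i]  # undo the swap
    return best, best_cost

# Hypothetical usage: 30 random points in the unit square.
pts = [(random.random(), random.random()) for _ in range(30)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
tour, length = anneal_tsp(dist)
```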
Anyway, we tried to show how the algorithm proceeded to a solution over time, and the visualizations looked a little like this.
Jesse:
For more geekiness on that website, check out how he actually sorts the colors. You can see it's not a perfect rainbow. He uses a 3D Hilbert space on an RGB cube.
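In case anyone wants to play with it: mapping each RGB color to its distance along a 3D Hilbert curve and sorting by that number looks roughly like this. This is a sketch based on Skilling's transpose algorithm, not the site's actual code:

```python
import random

def hilbert_index(rgb, bits=8):
    """Distance of an (r, g, b) point along a 3D Hilbert curve,
    via Skilling's transpose algorithm (AIP Conf. Proc. 707, 2004).
    Each channel must fit in `bits` bits."""
    X = list(rgb)
    n = len(X)
    M = 1 << (bits - 1)
    # Undo excess work (walk bit planes from high to low).
    Q = M
    while Q > 1:
        P = Q - 1
        for i in range(n):
            if X[i] & Q:
                X[0] ^= P                    # invert low bits
            else:
                t = (X[0] ^ X[i]) & P        # exchange low bits
                X[0] ^= t
                X[i] ^= t
        Q >>= 1
    # Gray encode.
    for i in range(1, n):
        X[i] ^= X[i - 1]
    t = 0
    Q = M
    while Q > 1:
        if X[n - 1] & Q:
            t ^= Q - 1
        Q >>= 1
    for i in range(n):
        X[i] ^= t
    # Interleave the transposed bits into a single integer.
    h = 0
    for b in range(bits - 1, -1, -1):
        for i in range(n):
            h = (h << 1) | ((X[i] >> b) & 1)
    return h

# Sorting by curve position keeps similar colors near each other,
# which is why the result looks almost, but not quite, like a rainbow.
colors = [tuple(random.randrange(256) for _ in range(3))
          for _ in range(1000)]
colors.sort(key=hilbert_index)
```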
July 26, 2010, 11:55 am
Sukrit:
/Serenity/ episodes, you mean /Firefly/ episodes :D.
July 26, 2010, 12:38 pm
Aaron:
I was actually just watching an MIT lecture series today about algorithms that focuses on sorting algorithms in the first part of the course. They're a great instructional tool for showing how and why some algorithms take longer than others to perform the same task, thereby introducing big-O, theta, and omega notation along with a whole host of other cool concepts at the same time. Incidentally, I was inspired by that lecture to home-brew quick implementations of the insertion sort and merge sort algorithms in Python and test them on a pretty old machine at work, on a randomly generated list of 100,000 integers. The results: merge sort (O(n log n), for the nerds) took 12 seconds, and insertion sort (O(n^2)) took... 20 minutes!
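My quick implementations looked roughly like the following (a cleaned-up sketch, not my exact code):

```python
import random
import time

def insertion_sort(a):
    """O(n^2): shift each element left until it is in place."""
    a = a[:]
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    """O(n log n): sort the halves recursively, then merge them."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

# The insertion sort pass will take a long while, as advertised.
data = [random.randrange(1_000_000) for _ in range(100_000)]
for sort in (merge_sort, insertion_sort):
    start = time.perf_counter()
    sort(data)
    print(sort.__name__, time.perf_counter() - start, "seconds")
```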
As for other algorithms, simulated annealing is pretty neat but by far the coolest algorithm I have seen that was designed as an analogy to some other natural process is the genetic algorithm. Other machine learning algorithms may be more efficient and/or more accurate (at the moment anyway), but how awesome is a program that adapts its own programming on the fly? Even though it's backed by the same statistics that power more "boring" ML algorithms, it just seems more "living."
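A toy example gets the flavor across: the classic "evolve a string toward a target" demo. The setup below is purely illustrative (the target, mutation rate, and population size are arbitrary), but the select-crossover-mutate loop is the real mechanism:

```python
import random

TARGET = "HELLO WORLD"
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(s):
    """Count the characters that already match the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.05):
    """Randomly replace characters at the mutation rate."""
    return "".join(random.choice(CHARS) if random.random() < rate else c
                   for c in s)

def crossover(a, b):
    """Single-point crossover of two parent strings."""
    cut = random.randrange(len(TARGET))
    return a[:cut] + b[cut:]

pop = ["".join(random.choice(CHARS) for _ in TARGET) for _ in range(200)]
for gen in range(1000):
    pop.sort(key=fitness, reverse=True)
    if pop[0] == TARGET:
        break
    # Keep the fittest half, breed the rest from random fit parents.
    parents = pop[:100]
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(100)]
print(gen, pop[0])
```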
July 27, 2010, 4:44 pm