Geeky Reflections -- Simulated Annealing

When I was an undergrad, my interest was in interfacing microcomputers with mechanical devices.  Most of what we did would be labelled "robotics" today, or at least proto-robotics (e.g. ripping the ultrasonic rangefinder out of a Polaroid camera, putting it on a stepper motor, and trying to paint a radar image of the room on a computer screen).

In doing this, we were playing around with S-100 bus computers (PCs were still a bit in the future at that point) and I got interested in brute-force approaches to solving the traveling salesman problem.  The way this is done is to establish some random points in x,y space, connect them with a random path, and measure the length of that path.  The initial random path is obviously going to be a terrible solution.  So you have the computer randomly flip-flop two segments of the path, and then you see if the resulting total distance is reduced.  If it is, then you keep the change and try another.
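For the curious, here is a hypothetical sketch of that accept-only-improvements loop in Python (the original was Pascal; the function names and iteration count are illustrative, not the original code):

```python
import math
import random

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[i - 1]])
               for i in range(len(order)))

def hill_climb(points, iterations=20000, seed=0):
    """Start from a random tour; keep a random segment reversal
    only if it shortens the tour."""
    rng = random.Random(seed)
    order = list(range(len(points)))
    rng.shuffle(order)
    best = tour_length(points, order)
    for _ in range(iterations):
        i, j = sorted(rng.sample(range(len(order)), 2))
        # flip-flop: reverse the segment between two random cut points
        candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
        length = tour_length(points, candidate)
        if length < best:          # keep only improvements
            order, best = candidate, length
    return order, best

# toy example: random points in the unit square
pts = [(random.random(), random.random()) for _ in range(30)]
route, dist = hill_climb(pts)
```

This is the "keep it only if it got better" version; the annealing twist described below changes only the acceptance test.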

This will lead to a much shorter path, but often will not lead to the optimal (shortest possible) path.  The reason is that the search can get stuck in a local minimum that is not the global optimum.  Essentially, to break out of this, you have to allow the solution to get worse first before it can get better.

The approach I was playing with was called simulated annealing.  Everything I said above is the same in this approach, but sometimes you let the program accept flip-flopped segments that yield a worse (i.e., longer) rather than better path.  The allowed amount worse is governed by a "temperature" that is slowly lowered.  Initially, at high temperatures, the solution can jump into almost any solution, better or worse.  But as the "temperature" is lowered, the allowed amount of jumping into worse solutions is reduced.  Essentially, the system is much, much more likely than the previous approach to settle close to the actual optimum.  This is roughly an analog of how annealing works in metals.  The code is ridiculously simple.  I don't remember it being much more than 100 lines in Pascal.
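A minimal Python sketch of that temperature-governed acceptance rule (the starting temperature, cooling rate, and the standard exp(-worsening/temperature) acceptance probability are my own illustrative choices, not the original Pascal program):

```python
import math
import random

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[i - 1]])
               for i in range(len(order)))

def anneal(points, t_start=10.0, t_end=1e-3, cooling=0.999, seed=0):
    """Simulated annealing: always accept a shorter tour, and sometimes
    accept a *longer* one, with probability exp(-increase / temperature).
    The temperature is lowered geometrically each step."""
    rng = random.Random(seed)
    order = list(range(len(points)))
    rng.shuffle(order)
    length = tour_length(points, order)
    t = t_start
    while t > t_end:
        i, j = sorted(rng.sample(range(len(order)), 2))
        candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
        cand_len = tour_length(points, candidate)
        # improvements always pass; worse moves pass with a probability
        # that shrinks as the temperature falls
        if cand_len < length or rng.random() < math.exp((length - cand_len) / t):
            order, length = candidate, cand_len
        t *= cooling          # cool down
    return order, length
```

At high temperature the exponential is close to 1 for almost any worsening, so the tour jumps around freely; as the temperature drops, worse moves are accepted less and less often and the tour "freezes" near a good solution.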

Anyway, if you lived through the above without falling asleep, the payoff is this site.  After 30 years of pretty much never thinking about simulated annealing again, I found Todd Schneider's blog, which has a great visual overview of solving the travelling salesman problem with simulated annealing.  If you really want to see it work, go to the customizable examples at the bottom and set the iterations per map draw to about 100.  Then watch.  It really does look a bit like a large excited molecule slowly cooling.  Here is an example below, but check out his site.

1. Joe:

Very cool! Thanks for sharing. Kind of like parenting: you have to allow mistakes or the system will not learn. The magic involves how many mistakes, what magnitude of mistakes, and how much help the parent gives putting the train back on the tracks.

2. JOrz:

Fascinating! I certainly didn't fall asleep. But then again, I'm definitely a geek. (I think you have a broken link in there)

4. Matthew Slyfield:

I work in IT. I do application development on a GIS (Geographic Information System) as a contractor working for a utility.

If this guy really has a better mousetrap for solving shortest-route problems, he is going to be obscenely wealthy.

5. J_W_W:

It's not P=NP (can a solution ever be?), but it's a very cool algorithm for low N values using the try-a-solution/evaluate-the-result method.

6. marque2:

Your description is a bit off. The Travelling Salesman problem is one of a set of problems referred to as NP-complete. As the number of elements gets large, it becomes impossible, even with all the computing power in the world, to solve these problems exactly. Brute-force traveling salesman is an N! (factorial) problem, where N is the number of destinations.

Since the computational power needed to find an exact solution doesn't exist, you take shortcuts that allow the computer to calculate a reasonably close guess at the best solution, without being sure you actually have the best solution. What you and this fellow were trying to do is use heuristics to find a reasonably close approximation of the best solution without having to go through all the iterations. There are several strategies, the simplest being the greedy algorithm (always choose the lowest-cost remaining element). A* from artificial intelligence is also widely used (http://en.wikipedia.org/wiki/A*_search_algorithm): if your current path's cost exceeds certain bounds, you start going down another path, until other paths become more costly than the original one. One way to avoid poor false minima (you are usually going to get a local minimum rather than the true solution, since you are taking shortcuts) is to keep the 10 best paths and choose among them at the end rather than searching for just one. This guy just came up with a different heuristic; he isn't finding the exact solution.
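The greedy strategy the comment describes, applied to the traveling salesman setting, is the nearest-neighbour heuristic. A hypothetical minimal sketch (illustrative only, not anyone's actual code):

```python
import math

def greedy_tour(points):
    """Nearest-neighbour heuristic: start at city 0 and repeatedly
    visit the closest city not yet in the tour (the lowest-cost
    remaining element)."""
    unvisited = set(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        # greedy choice: nearest unvisited city
        nxt = min(unvisited, key=lambda k: math.dist(last, points[k]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# four corners of a 2x1 rectangle: the greedy walk goes around it in order
print(greedy_tour([(0.0, 0.0), (0.0, 1.0), (2.0, 1.0), (2.0, 0.0)]))
```

It runs in polynomial time and often gives a decent tour, but like the other heuristics mentioned here it carries no guarantee of optimality.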

Interestingly, computer scientists and mathematicians would like to compute these problems in polynomial time: n^k for some fixed k, which, even if k is fairly large, is in the long run far less work to compute than n!. This is the P versus NP question. There is considerable research into whether these n! problems can be done in polynomial time, because it has never been proven that they can't. And if someone discovers a way to do any one of the NP-complete problems in polynomial time, then, because they all map onto one another, all of the NP-complete problems would be solvable in polynomial time. I know that seems a bit bizarre. (http://en.wikipedia.org/wiki/NP-complete) Unfortunately, many suspect a proof that you can or can't solve these in polynomial time will not be forthcoming, or it would already have been discovered.

As Matthew Slyfield mentioned, if this guy actually did solve the NP Complete Problem, he is likely to become quite famous, however, as I mentioned before, it looks like he came up with another heuristic, or approximation of the solution.

Note, under certain circumstances and/or with special tricks, you can actually get exact answers for fairly large instances. But if you figure out a way to cut the computation by, say, a factor of a million, n!/1,000,000 is ultimately only trivially less time than plain brute-force n!. (http://en.wikipedia.org/wiki/Travelling_salesman_problem)

7. marque2:

must be a failure of an annealing routing algorithm :P

8. Eric Wilner:

Interesting!
I'd encountered simulated annealing back when Xilinx was new, and the place & route software for their FPGAs used annealing. If memory serves, it had cool visuals back then, to show the process at work. (And it appears that their latest and greatest software has gone back to using annealing!)

9. Matthew Slyfield:

"As Matthew Slyfield mentioned, if this guy actually did solve the NP
Complete Problem, he is likely to become quite famous, however, as I
mentioned before, it looks like he came up with another heuristic, or
approximation of the solution."

I agree with you that this is likely not "the one true optimal answer" solution, and is instead another heuristic. However, the developer can still end up becoming obscenely wealthy if his heuristic is either generally more efficient or generally comes closer to "the one true optimal answer".

How do these methods differ from genetic algorithms?

Warren,

12. Zach:

As the great MC++ rapped, "if we could factor large composites in poly time/we'd have enough money to not have to rhyme."

Integer factorization is another hard problem in NP (though, unlike travelling salesman, it is not known to be NP-complete). If someone were to develop an efficient general-purpose method for integer factorization, it would likely end public-key encryption as we know it.

13. obloodyhell:

yup. not working for me, either

14. obloodyhell:

}}} Here is an example below but check out his site.

Seems as though it's off the beaten path... :-<

15. Urszula Koczyńska:

I can't open that link :( podrozepousa.com.pl