February 5, 2018, 1:06 pm
Is there some sort of metric for the complexity of a boundary like this one for PA District 7, among those invalidated in PA (in a decision the Supreme Court today refused to review)?
I keep wondering whether there is an objective standard we can set, rather than just the sort of "I know it when I see it" one that everyone seems to use.
One way I can imagine testing it is Monte Carlo style: draw a series of lines from one random point in the shape to another random point in the shape and calculate what percentage of the lines cross the boundary at least once. That metric would be zero for a circle or a rectangle (or any convex shape) but very high for a shape like this one.
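Here is a rough sketch of that sampling idea in Python, using the shapely library. The zigzag polygon below is a made-up stand-in for a contorted district, not the actual PA-7 outline, and the trial count is an arbitrary choice.

```python
import random
from shapely.geometry import Polygon, Point, LineString

def random_point_inside(poly):
    """Rejection-sample a uniform random point inside the polygon."""
    minx, miny, maxx, maxy = poly.bounds
    while True:
        p = Point(random.uniform(minx, maxx), random.uniform(miny, maxy))
        if poly.contains(p):
            return p

def crossing_fraction(poly, trials=10_000):
    """Fraction of random interior-to-interior segments that leave the shape."""
    crossings = 0
    for _ in range(trials):
        a, b = random_point_inside(poly), random_point_inside(poly)
        segment = LineString([(a.x, a.y), (b.x, b.y)])
        if not poly.contains(segment):   # segment exits and re-enters the shape
            crossings += 1
    return crossings / trials

# A convex shape versus a serpentine one (a square with two slots cut into it).
square = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
zigzag = Polygon([(0, 0), (10, 0), (10, 2), (2, 2), (2, 4), (10, 4),
                  (10, 6), (2, 6), (2, 8), (10, 8), (10, 10), (0, 10)])

print(crossing_fraction(square))   # ~0.0 for any convex shape
print(crossing_fraction(zigzag))   # substantially higher for a contorted shape
```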
February 1, 2008, 9:21 am
The Reference Frame has a video of a dog solving the traveling salesman problem. I was doing some simulations years ago for a railroad company and actually had a traveling-salesman-like problem to solve with equipment routing. The best approach I found was simulated annealing. The algorithm starts with a totally random solution, applies a random swap of route legs, and checks whether the new route is better or worse than the old one. So far this is similar to any Monte Carlo approach, but simulated annealing also allows the solution to jump to worse routes, and the size of the worse jump it will accept is reduced over time as the algorithm runs. This helps prevent the algorithm from getting stuck in local minima.
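For the curious, here is a toy version of the idea in Python. The leg swap is done as a simple 2-opt style reversal, and the starting temperature, cooling rate, and step count are arbitrary choices for illustration, not the values from the railroad work.

```python
import math
import random

def route_length(route, pts):
    """Total length of the closed tour visiting pts in the given order."""
    return sum(math.dist(pts[route[i]], pts[route[(i + 1) % len(route)]])
               for i in range(len(route)))

def anneal(pts, temp=10.0, cooling=0.999, steps=100_000):
    route = list(range(len(pts)))
    random.shuffle(route)                      # start from a totally random tour
    cur_len = route_length(route, pts)
    best, best_len = route[:], cur_len
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(pts)), 2))
        new = route[:i] + route[i:j + 1][::-1] + route[j + 1:]   # reverse one leg
        new_len = route_length(new, pts)
        delta = new_len - cur_len
        # Always accept improvements; accept worse tours with a probability
        # that shrinks as the temperature cools, so big uphill jumps become rare.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            route, cur_len = new, new_len
            if cur_len < best_len:
                best, best_len = route[:], cur_len
        temp *= cooling
    return best, best_len

cities = [(random.random(), random.random()) for _ in range(30)]
tour, length = anneal(cities)
print(length)
```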
It is called simulated annealing because it closely parallels the process of cooling and crystallization in a piece of steel. When heated steel is plunged into water and cooled quickly, the molecules crystallize and are trapped in a higher energy state, whereas cooling the steel slowly lets the structure settle into a much lower energy state. Metal that is quench cooled is harder but more brittle; metal that is annealed is softer and more ductile. In the algorithm, the slow reduction in temperature is represented by the declining amount by which the algorithm can jump to a worse solution.