> Suppose you want to visit every state capitol exactly once, traveling on highways. Which route is shortest? Which route is longest? What is the mean and standard deviation of all the routes?
> So these “BDDs” allow for “efficient” solutions to the traveling salesman problem? Call me a skeptic, but I’m skeptical...
As with any typical problem (NP-hard or otherwise), very large instances are infeasible to solve, and very small instances can be quickly solved. What's interesting is the in-between region: instances that can be solved within a human lifetime, but where efficient algorithms and data structures can often make the difference between getting the answer in days (or months) versus seconds.
For example, the first computation of the exact number of closed knight's tours on an 8x8 chessboard was done in 1996 using BDDs. (The program actually had a bug, which was corrected later; in the meantime another method gave the correct answer, but the BDD method itself was sound.)
It's indeed true that the traveling salesman problem (TSP) is a “hard problem” in the sense that its decision version is NP-complete. But this means only that no algorithm is known whose running time grows polynomially in the input size (and finding one would imply P = NP); it says nothing about how long concrete instances take in practice. The Concorde TSP Solver, for instance, has solved instances as large as 85,900 vertices, and instances with about 50 vertices can be solved quickly even with a simple branch-and-bound search.
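To make the “simple branch-and-bound” remark concrete, here is a minimal sketch (my own illustration, not Concorde's algorithm): it enumerates tours depth-first and prunes any partial tour that is already at least as long as the best complete tour found so far. Real solvers use far stronger lower bounds (e.g. from linear programming relaxations); the 4-city distance matrix is an illustrative stand-in.

```python
import math

def tsp_branch_and_bound(dist):
    """Shortest closed tour through all cities, by depth-first search
    with pruning. dist is a symmetric n x n matrix of distances."""
    n = len(dist)
    best = math.inf

    def search(path, visited, length):
        nonlocal best
        if length >= best:        # prune: this branch cannot improve
            return
        if len(path) == n:        # all cities visited: close the tour
            best = min(best, length + dist[path[-1]][path[0]])
            return
        last = path[-1]
        # Trying nearer cities first finds good tours early,
        # which tightens the pruning bound sooner.
        for nxt in sorted(range(n), key=lambda j: dist[last][j]):
            if nxt not in visited:
                visited.add(nxt)
                search(path + [nxt], visited, length + dist[last][nxt])
                visited.remove(nxt)

    search([0], {0}, 0)
    return best

# Illustrative 4-city instance; the optimal tour 0-1-3-2-0 has length 18.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(tsp_branch_and_bound(dist))  # -> 18
```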
In this case (all the numbers in this paragraph are from Knuth's treatment in Volume 4A of TAOCP), the graph of the contiguous US states has 48 vertices and 105 edges (it turns out each state has on average only a little over 4 neighbours). A Hamiltonian path is a subset of these edges that forms a single path visiting each vertex exactly once. What BDDs/ZDDs enable is to succinctly encode which of the 2^105 edge subsets are Hamiltonian paths: it turns out this can be done with a data structure of just 28,808 nodes. Once you have constructed this data structure (ZDD), you can answer many questions about the entire set of Hamiltonian paths, more quickly than you could by backtracking or similar:

- How many of them are there? (68,656,026.)
- How many end in California? (2,707,075.)
- Which one is shortest (11,698 miles)? This is the standard TSP question, for which many other methods exist. And which one is longest (18,040 miles)?
- What is the lexicographically first one, according to some order?
- Can you quickly sample a path at random, such that each path is equally likely to be picked?

Etc.
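If you want to play with this, the graphillion Python library implements this kind of ZDD construction over edge subsets (via frontier-based search). A small sketch using its documented API; the grid graph, the endpoints, and the unit edge weights are illustrative stand-ins, not Knuth's 48-state data:

```python
# pip install graphillion
from graphillion import GraphSet
import graphillion.tutorial as tl  # ships with a grid-graph helper

GraphSet.set_universe(tl.grid(4, 4))             # 5x5 grid, vertices 1..25
paths = GraphSet.paths(1, 25, is_hamilton=True)  # ZDD of Hamiltonian 1-to-25 paths

print(len(paths))                                # how many are there?

weights = {e: 1 for e in GraphSet.universe()}    # stand-in for mileages
shortest = next(paths.min_iter(weights))         # a minimum-weight path
longest = next(paths.max_iter(weights))          # a maximum-weight path

random_path = next(paths.rand_iter())            # uniform random sample
via_edge = paths.including((1, 2))               # paths using edge (1, 2)
print(len(via_edge))
```

The point mirrors the paragraph above: once the ZDD is built, counting, optimization over weights, conditional queries, and uniform sampling are all cheap operations on the same structure.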
For arbitrary worst-case input (which is what the NP-hardness of TSP is about), the corresponding BDD/ZDD may turn out to be too large, so we cannot do any of this efficiently. The magic is that in many cases of practical interest, we happen to get small BDDs. Knuth's treatment (57 pages of text, 22 of exercises, 58 of solutions) ends with a caveat that, among other things, points out:
> They apply chiefly to problems that have more solutions than can readily be examined one by one, problems whose solutions have a local structure that allows our algorithms to deal with only relatively few subproblems at a time.
As Knuth mentions, Randal Bryant's 1986 paper that (re)introduced BDDs (ordered and reduced, as the term generally means today) was for many years the most-cited paper in computer science according to CiteSeer. It's very readable and discusses both the advantages and the disadvantages: https://www.cs.cmu.edu/~bryant/pubdir/ieeetc86.pdf