An entirely different search program.
This one does not rigorously attempt to find the _best_ dissection,
because it doesn't try any dissection whose pieces have a common
denominator larger than n. It works by narrowing down the possible
ways to cut each stick, constructing an integer linear programming
(ILP) problem whose variables are 'how many n-sticks / m-sticks are
cut _this_ way?' and whose constraints require the same number of
pieces of each size on each side of the dissection, and feeding that
problem to lp_solve.
So it won't find the best dissection if that dissection's denominator
is larger than n. Empirically, every best dissection found so far has
had denominator at most n, but I have no proof that the denominator is
bounded like this in all cases (though 5 into 3 demonstrates that the
denominator can at least exceed _m_).
It runs much faster than the other searcher, though! So I might use it
to push the search further at the expense of certainty. (And
writinghawk's upper-bound technique will hopefully show at least
_some_ of its results to be optimal.)