It's not really clear what this code was trying to do. Write i and o
for the initial values of r->ibits and r->obits, respectively, i' and o'
for their respective final values, and O for RAND_OBITS. In the case
that i + o <= O, we set i' = 0 and o' = i + o, maintaining the
invariant that i' + o' = i + o. But if i + o > O, then we set o' = O and
i' = (i + o) - i = o, which seems nonsensical. In particular, in the
case that i = 1 and o = O, it apparently magics O - 1 bits of entropy
from nowhere.
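Schematically, the described update amounts to the following (pseudocode
mirroring the algebra above, not the actual diff):

	if (r->ibits + r->obits <= RAND_OBITS) {
		r->obits += r->ibits;
		r->ibits = 0;
	} else {
		/* i' = (i + o) - i = o: credits entropy that was never added */
		r->ibits = (r->ibits + r->obits) - r->ibits;
		r->obits = RAND_OBITS;
	}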
Modify the code so that it at least maintains the sum of the entropy
counters in either branch. I'm not sure this is actually correct, but
it seems like a defensible position.
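A minimal sketch of the intended change, assuming the field and macro
names above (not necessarily the exact form of the committed code):

	if (r->ibits + r->obits <= RAND_OBITS) {
		/* Everything fits: move all input entropy to the output counter. */
		r->obits += r->ibits;
		r->ibits = 0;
	} else {
		/*
		 * Cap the output counter and keep the excess on the input
		 * side, so that i' + o' = i + o holds in this branch too.
		 */
		r->ibits = (r->ibits + r->obits) - RAND_OBITS;
		r->obits = RAND_OBITS;
	}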