From: Mark Wooding
Date: Thu, 26 May 2016 08:26:09 +0000 (+0100)
Subject: math/pfilt.c (pfilt_jump): Fix off-by-one error in reduction.
X-Git-Tag: 2.2.4~13
X-Git-Url: http://www.chiark.greenend.org.uk/ucgi/~mdw/git/catacomb/commitdiff_plain/a28748f8e0b2a820666b72f4a022cba19ba70c51

math/pfilt.c (pfilt_jump): Fix off-by-one error in reduction.

Oh, dear.  This is quite a bad one.

The loop added the residues for the jump to the candidate, and reduced
each if the result was strictly higher than the modulus.  It then
reported failure (or immediate success) if any residue was zero,
otherwise reporting a candidate for subsequent testing.  Obviously,
this is a stupid bug, with the result that, effectively, every step
reports a candidate for further testing.

This bug has two bad consequences.

  * Candidates with small factors aren't weeded out, so prime
    searching takes an unnecessarily long time.  I'd spotted this, but
    didn't have a way in to investigate the problem.

  * Candidates which actually have small factors, but are in fact
    below the `smallenough' threshold, are reported as being verified
    as prime, so the overall procedure erroneously returns known
    composites.
---

diff --git a/math/pfilt.c b/math/pfilt.c
index de049cbe..a4c9856b 100644
--- a/math/pfilt.c
+++ b/math/pfilt.c
@@ -304,7 +304,7 @@ int pfilt_jump(pfilt *p, const pfilt *j)
   for (i = 0; i < NPRIME; i++) {
     p->r[i] = p->r[i] + j->r[i];
-    if (p->r[i] > primetab[i])
+    if (p->r[i] >= primetab[i])
      p->r[i] -= primetab[i];
    if (!p->r[i] && rc == PGEN_TRY) {
      if (MP_LEN(p->m) == 1 && p->m->v[0] == primetab[i])
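
For clarity, a minimal standalone sketch of the reduction step follows.
It is not Catacomb code: the array sizes, names (NP, jr, jump_residues)
and the small prime table are illustrative.  Each residue starts in
[0, p_i), so after adding the jump's residue a single conditional
subtraction suffices -- but only if the comparison is `>=': with `>', a
sum exactly equal to p_i is left unreduced, so the zero test that spots
a small factor never fires.

    #include <stdio.h>

    #define NP 4
    static const unsigned primetab[NP] = { 3, 5, 7, 11 };

    /* Update residues r[] (candidate mod primetab[]) by the jump's
     * residues jr[]; return nonzero if some residue hits zero, i.e. the
     * new candidate is divisible by a small prime.
     */
    static int jump_residues(unsigned r[NP], const unsigned jr[NP])
    {
      unsigned i;
      int composite = 0;

      for (i = 0; i < NP; i++) {
        r[i] = r[i] + jr[i];
        if (r[i] >= primetab[i])   /* writing `>' here is the bug fixed above */
          r[i] -= primetab[i];
        if (!r[i])                 /* divisible by primetab[i] */
          composite = 1;
      }
      return composite;
    }

    int main(void)
    {
      /* Candidate 8, jumping by 7: the new candidate 15 = 3 * 5. */
      unsigned r[NP]  = { 8 % 3, 8 % 5, 8 % 7, 8 % 11 };
      unsigned jr[NP] = { 7 % 3, 7 % 5, 7 % 7, 7 % 11 };

      printf("composite? %d\n", jump_residues(r, jr));  /* prints 1 */
      return 0;
    }

In this example the sums modulo 3 and 5 are exactly 3 and 5; with `>='
both wrap to zero and 15 is correctly rejected, whereas with `>' they
stay at 3 and 5, the candidate is passed on for further testing, and
the stored residues are no longer the true values mod 3 and 5.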