Note: the primes up to 64K are only the lowest level of a three-level operation for sieving up to 2^64 - the factors of the factors, as it were, since 64K = 2^16 is the fourth root of 2^64: the 64K primes generate the primes up to 2^32, which in turn sieve any window below 2^64. And each level gets considerably more complicated to code and verify (even 'just' a reference implementation); a minimal sketch of how the first two levels chain together follows the list below. I've already discussed a few issues regarding the verification of prime sieves over on Stack Overflow last year:
- How can one verify the proper operation of a sieve close to 2^64?
- Checksumming large swathes of prime numbers? (for verification)
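To make the level structure concrete, here's a minimal sketch of the middle level, assuming the 64K primes are handed over as a sorted ushort[] (like the Primes array used further down); a production sieve would work on odd numbers only, produce windows on demand and so on, but the point is simply that the level-0 primes are all it takes to reach 2^32:

// Sketch only (not code from this post). The 64K primes (level 0) are exactly
// what a segmented sieve needs to enumerate every prime up to 2^32 (level 1),
// because sqrt(2^32) = 2^16 = 64K; the level-1 primes then cover any window
// below 2^64 (level 2) by the same argument.
// needs: using System; using System.Collections.Generic;
static IEnumerable<uint> Level1Primes(ushort[] smallPrimes, uint limit)
{
    const int SegmentSize = 1 << 16;                    // one 64K window at a time
    var composite = new bool[SegmentSize];

    for (ulong low = 2; low <= limit; low += (uint)SegmentSize)
    {
        Array.Clear(composite, 0, SegmentSize);
        ulong high = Math.Min(low + (uint)SegmentSize - 1, limit);

        foreach (ushort p in smallPrimes)               // ascending order assumed
        {
            ulong square = (ulong)p * p;
            if (square > high)
                break;                                  // remaining primes start marking at p^2, past this window
            ulong first = Math.Max(square, (low + p - 1) / p * (ulong)p);
            for (ulong m = first; m <= high; m += p)
                composite[m - low] = true;
        }

        for (ulong n = low; n <= high; n++)
            if (!composite[n - low])
                yield return (uint)n;
    }
}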
Preempting an objection that hasn't been raised yet - the class could and should compare prime count, sum and product against expected values, just like the file vetting script linked above does:
assert_equal("count", 6542, Primes.Length);
var sum = Primes.Select((p) => (int)p).Sum();
assert_equal("sum", 202288087, sum);
var primorial_64bit = Primes.Select((p) => (ulong)p).Aggregate((product, prime) => product * prime);
assert_equal("product", 8987519195527561682UL, primorial_64bit);
Another consideration: a reference implementation remains verified only as long as the verified binary remains unmodified - that is, until someone hits the 'build' button again, at which point there's a new binary that requires verification. Add to that the fact that I've uncovered (and reported) several compiler bugs in each of the compilers I used extensively, starting with MS VC++, Borland C++ and Delphi, and a whole slew of them in FoxPro. And there are plenty more ways things can go wrong, even without compiler bugs (starting with stale object files/libraries). In that light, a local, write-protected file is a whole lot simpler and inspires a whole lot more trust.
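For what it's worth, the 'trust the file' approach can also be made to fail loudly if the file ever does change, by pinning a digest next to the count/sum/product checks shown earlier. The following is only a sketch of that idea, not the vetting script mentioned above; the file name, the on-disk layout and the digest value are all placeholders:

// Sketch only: re-check the vetted file's digest at load time, so a modified or
// truncated file gets rejected instead of silently feeding bad primes to the
// upper levels. File name, layout and digest are placeholders, not real values.
// needs: using System; using System.IO; using System.Security.Cryptography;
static ushort[] LoadVettedPrimes()
{
    const string path = "primes_below_64k.bin";         // hypothetical file name
    const string expectedSha256 = "<digest recorded when the file was vetted>";

    byte[] bytes = File.ReadAllBytes(path);

    using var sha = SHA256.Create();
    string actual = BitConverter.ToString(sha.ComputeHash(bytes)).Replace("-", "");
    if (!actual.Equals(expectedSha256, StringComparison.OrdinalIgnoreCase))
        throw new InvalidDataException(path + " does not match its vetted SHA-256 digest");

    // assuming 6542 little-endian 16-bit values, one per prime below 64K
    var primes = new ushort[bytes.Length / 2];
    Buffer.BlockCopy(bytes, 0, primes, 0, bytes.Length);
    return primes;
}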