md: Fix hashing for data >= 256 GB
* cipher/hash-common.h (gcry_md_block_ctx): Add "nblocks_high".
* cipher/hash-common.c (_gcry_md_block_write): Bump NBLOCKS_HIGH.
* cipher/md4.c (md4_init, md4_final): Take care of NBLOCKS_HIGH.
* cipher/md5.c (md5_init, md5_final): Ditto.
* cipher/rmd160.c (_gcry_rmd160_init, rmd160_final): Ditto.
* cipher/sha1.c (sha1_init, sha1_final): Ditto.
* cipher/sha256.c (sha256_init, sha224_init, sha256_final): Ditto.
* cipher/sha512.c (sha512_init, sha384_init, sha512_final): Ditto.
* cipher/tiger.c (do_init, tiger_final): Ditto.
* cipher/whirlpool.c (whirlpool_final): Ditto.
* cipher/md.c (gcry_md_algo_info): Add GCRYCTL_SELFTEST.
(_gcry_md_selftest): Return "not implemented" as required.
* tests/hashtest.c: New.
* tests/genhashdata.c: New.
* tests/Makefile.am (TESTS): Add hashtest.
(noinst_PROGRAMS): Add genhashdata.
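The gist of the change is a second, high word for the 32-bit block
counter, which is bumped whenever the low word wraps around.  A rough
sketch of the idea (field names follow hash-common.h, the helper
function is made up for illustration; the real code differs in
details):

  #include <stdint.h>

  typedef struct
  {
    uint32_t nblocks;       /* Low 32 bits of the block counter.       */
    uint32_t nblocks_high;  /* New: high 32 bits of the block counter. */
    /* ... remaining context fields ... */
  } gcry_md_block_ctx_t;

  static void
  count_one_block (gcry_md_block_ctx_t *ctx)
  {
    ctx->nblocks++;
    if (!ctx->nblocks)      /* The low word wrapped around, ...  */
      ctx->nblocks_high++;  /* ... so carry into the high word.  */
  }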
Problem found by Denis Corbin and analyzed by Yuriy Kaminskiy.
sha512 and whirlpool should not have this problem because they use
64-bit types for counting the blocks.  However, a similar fix has been
applied to allow for really huge sizes, even though it will be very
hard to test them.
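For the length padding to come out right, the *_final functions have to
fold the high word into the 64-bit bit count.  An illustrative sketch
of that computation, assuming a 64 byte block size (the function name
is made up; the real code in e.g. cipher/sha1.c differs in details):

  #include <stdint.h>

  /* Convert the 64-bit block count (NBLOCKS_HIGH:NBLOCKS) plus the
     byte count of the final partial block into a 64-bit bit count
     (MSB:LSB).  */
  static void
  bit_count (uint32_t nblocks, uint32_t nblocks_high, uint32_t count,
             uint32_t *msb, uint32_t *lsb)
  {
    uint32_t t;

    /* Multiply the block count by 64 to get a byte count.  */
    *lsb = nblocks << 6;
    *msb = (nblocks_high << 6) | (nblocks >> 26);

    /* Add the bytes of the final partial block, with carry.  */
    t = *lsb;
    if ((*lsb += count) < t)
      (*msb)++;

    /* Multiply by 8 to get the bit count.  */
    t = *lsb;
    *lsb <<= 3;
    *msb = (*msb << 3) | (t >> 29);
  }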
The test vectors have been produced by sha{1,224,256}sum and the
genhashdata tool.  A sequence of 'a' is used for them because a test
using one million 'a' is commonly used for test vectors.  More test
vectors are required.  Running the large tests needs to be done
manually for now:

  ./hashtest --gigs 256

tests all algorithms, while

  ./hashtest --gigs 256 sha1 sha224 sha256

tests only the given ones.  A configure option to include these tests
in the standard regression suite would be useful; they take a very
long time to run.
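For reference, hashing that much data through the public API boils down
to feeding it in chunks; a minimal sketch (the real tests/hashtest.c is
more elaborate):

  #include <stdio.h>
  #include <string.h>
  #include <gcrypt.h>

  /* Hash GIGS gigabytes of 'a' bytes with ALGO and print the digest.  */
  static void
  hash_gigs_of_a (int algo, unsigned long gigs)
  {
    static char buf[1024 * 1024];   /* One mebibyte of 'a'.  */
    gcry_md_hd_t hd;
    const unsigned char *digest;
    unsigned long i, chunks = gigs * 1024;
    unsigned int n, dlen = gcry_md_get_algo_dlen (algo);

    memset (buf, 'a', sizeof buf);
    if (gcry_md_open (&hd, algo, 0))
      return;                       /* Error handling omitted.  */
    for (i = 0; i < chunks; i++)
      gcry_md_write (hd, buf, sizeof buf);
    digest = gcry_md_read (hd, algo);
    for (n = 0; n < dlen; n++)
      printf ("%02x", digest[n]);
    putchar ('\n');
    gcry_md_close (hd);
  }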
Signed-off-by: Werner Koch <wk@gnupg.org>