On Thu, 2007-08-30 at 17:12 -0400, Bob Johnson wrote:
> But an audio CD will run out of data before you fill a 1GB file. Region
> encoding tries to get in the way with a DVD.
> For repeated use, I would be inclined to create a DVD full of data from
> urandom, then use dd to pull data off of it with various offsets to create
> test files. Or you could copy a few music CDs to one big file that you burn
> to your DVD to create a collection of semi-random data for test files.
I would shake my head ruefully at how you guys are overthinking this,
and just use /dev/urandom, because that's what it's there for, and it's
not like /dev/random stores up infinite entropy forever if you just
leave it alone. The default size of /dev/random's entropy pool is 512
bytes, and it doesn't
take long to renew that. Unless you're doing constant ultra-secure
cryptography like streaming intelligence intercepts or something, you
really shouldn't concern yourself with the effects of
pulling from /dev/urandom to your heart's content. I guarantee you,
there are few purposes short of one-time-pad generation where a gig
of /dev/urandom won't be statistically indistinguishable from random.