Thanks. It wasn't clear from the text included with your message
what you were after.
Setting the block size makes a significant difference in speed:
dd if=/dev/cdrom of=file1 bs=4096 count=254592
dd if=/dev/cdrom of=file1 bs=1 count=1042808832
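A quick way to see the block-size effect (reading from /dev/zero rather than /dev/cdrom so it runs on any box, no disc needed): both commands below write the same 1 MiB, but the bs=1 run issues a read/write pair per byte.

```shell
# Same total size (256 * 4096 = 1048576 = 1 MiB), different block sizes.
# The bs=1 variant is dramatically slower because of per-byte syscalls.
time dd if=/dev/zero of=big-bs bs=4096 count=256 2>/dev/null
time dd if=/dev/zero of=small-bs bs=1 count=1048576 2>/dev/null
```

The output file names here are arbitrary examples, not from the thread.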
On Aug 30, 2007, at 12:28 PM, Raymond Page wrote:
> He wanted random data. If I wanted to generate a large file, I would
> do it your way too. However, a randomly filled file seemed to be the
> goal. Using an audio CD guarantees some fairly random data (given the
> range of noise in various music), and it costs nothing from
> /dev/random or /dev/urandom.
> On 8/29/07, John H. Sawyer <[log in to unmask]> wrote:
>> Ick. /dev/zero is so much quicker and even faster with a larger
>> block size.
>> dd if=/dev/zero of=file1 bs=4096 count=254592
>> 254592+0 records in
>> 254592+0 records out
>> 1042808832 bytes (1.0 GB) copied, 21.6826 seconds, 48.1 MB/s
>> On Wed Aug 29 20:57:25 EDT 2007, Raymond Page <[log in to unmask]> wrote:
>>> What about just popping in your favorite music cd and pulling
>>> data from that?
>>> Personally I'd use dd to get the right file size (if the block size
>>> takes you over, make the count short and then append the remainder)
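The "make it short and then append" trick above can be sketched as follows, using /dev/zero as the input for portability (the thread's suggestion was a CD device). SIZE is an arbitrary example value, deliberately not a multiple of the block size:

```shell
# Fill most of the target with large blocks, then top up the
# remainder one byte at a time. 1000000 = 244 * 4096 + 576.
SIZE=1000000
BS=4096
dd if=/dev/zero of=file1 bs=$BS count=$((SIZE / BS)) 2>/dev/null
dd if=/dev/zero bs=1 count=$((SIZE % BS)) 2>/dev/null >> file1
wc -c file1
```

Only the remainder pays the bs=1 penalty, so the file lands at exactly SIZE bytes without the whole copy being slow.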
>>> On 8/6/07, Edward Allcutt <[log in to unmask]> wrote:
>>>> On Mon, 2007-08-06 at 08:51 -0400, Allen S. Rout wrote:
>>>>> So, for debugging purposes (long story) I want to create a
>>>>> couple of files which are precisely
>>>>> bytes long. We were musing about optimal ways to do this.
> Raymond Page