I have written a Perl script (Strawberry Perl) to fill up some old hard drives with semi-random data a couple of times before sending them to recycling.
The script first creates a ~100 MB master file of random 0s and 1s, which is then written 10 times into each target file.
This process is repeated depending on the hard disk size (in the example only 3 times, for testing purposes).
Currently, it's taking around 36 seconds per file.
My question is: what changes can I make to the script to speed this up?
Here is the script in full:
use strict;
use warnings;
my $master = "c:/programme/tools/filluphd/master.txt";
my $log = "c:/programme/tools/filluphd/log.log";
my $file = "f:/file_";
#create master: 100 million random '0'/'1' characters, ~100 MB
open my $master_fh, ">", $master or die "cannot open master: $!\n";
for my $i (1 .. 100_000_000)
{
print {$master_fh} int rand 2;
}
close $master_fh;
open my $log_fh, ">", $log or die "cannot open log: $!\n";
print {$log_fh} "start: ".localtime()."\n";
my $start_time = time();
#slurp the master file back into memory in one read
my $random;
open $master_fh, "<", $master or die "cannot open master: $!\n";
{
local $/;
$random = <$master_fh>;
}
close $master_fh;
for my $n (0 .. 2)
{
my $file_start = time();
my $filename = $file.$n.".txt";
open my $out, ">", $filename or die "cannot open $filename: $!\n";
#10 copies of the master, ~1 GB per file (was 11 due to an off-by-one)
print {$out} $random for 1 .. 10;
close $out;
my $file_runtime = time() - $file_start;
print {$log_fh} "file ".$filename." took ".$file_runtime." seconds!"."\n";
}
my $run_time = time() - $start_time;
print {$log_fh} "run time: ".$run_time." seconds"."\n";
print {$log_fh} "finish: ".localtime()."\n";
close $log_fh;
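One change worth trying is to avoid the per-character rand()/print loop entirely: build a buffer of random bytes in memory with pack and write it out in large blocks, so the inner loop does ~100 big writes instead of 100 million tiny prints. This is a sketch, not a drop-in replacement; the 1 MB chunk size and the f:/file_0.bin output path are assumptions for illustration.

use strict;
use warnings;

# Build one 1 MB buffer of random bytes (chunk size is an assumption).
my $chunk = pack 'C*', map { int rand 256 } 1 .. 1024 * 1024;

# Write the same chunk 100 times, ~100 MB in large block writes.
open my $out, '>:raw', 'f:/file_0.bin' or die "cannot open: $!\n";
syswrite $out, $chunk for 1 .. 100;
close $out;

syswrite bypasses Perl's buffered I/O and hands each 1 MB block straight to the OS; whether that helps depends on where the real bottleneck is (CPU for rand() versus the disk interface).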
Update: I did some measuring and found out that the case the HD sits in is USB2-only. At about 30 MB/s it is completely saturated, and the Windows tools confirm this; with ~1 GB per file (10 × 100 MB) written in 36 seconds, that works out to roughly 28 MB/s. Speed-wise, there is not much else to do.