I have a number of IIS servers, each with a number of sites on them and I want to zip all IIS logs regularly.
I cobbled together the following PowerShell script with the help of this site and Google:
$files = Get-ChildItem "D:\logfiles\IIS-Logs\*.log" -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))}
foreach ($file in $files) {& 'C:\Program Files\WinRAR\winrar.exe' a -tl -df -m5 "$file.rar" $File}
The problem with this script is that if there are, say, 2,000 total log files, it tries to launch 2,000 simultaneous copies of WinRAR and the server crashes. This was unexpected; I expected it to zip the files one at a time, sequentially.
Does anyone have any ideas to make this work like I want?
I'd really like to use Winrar vs the native Compress-Archive option because:
I'm not married to Winrar if I can achieve this another way.
You probably want to use "Start-Process -Wait" instead of the call operator &. PowerShell only waits for console applications to exit; winrar.exe is a GUI program, so & returns immediately and your loop launches every copy at once. The -Wait flag on Start-Process forces it to wait for completion, which makes the archives run sequentially. Check this article on how to use Start-Process: A Better PowerShell Start Process

You might also want to use the cmdlet Compress-Archive instead of the command-line program WinRAR, which is better integrated and gives you better feedback in your scripting.
Something like this with WinRAR?
$files = Get-ChildItem "D:\logfiles\IIS-Logs\*.log" -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))}
foreach ($file in $files) {
$arguments = 'a -tl -df -m5 "' + $file.FullName + '.rar" "' + $file.FullName + '"'
Start-Process -Wait -FilePath 'C:\Program Files\WinRAR\winrar.exe' -ArgumentList $arguments
}
(Note: don't name the variable $args — that's an automatic variable in PowerShell, and assigning to it causes odd behavior.)
Or this with Compress-Archive
$files = Get-ChildItem "D:\logfiles\IIS-Logs\*.log" -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))}
foreach ($file in $files) {
Compress-Archive -Path $file.FullName -DestinationPath "$($file.FullName).zip"
}
The above are untested, but should work unless I made a typo.
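One difference worth noting: the WinRAR command uses the -df switch, which deletes each log after archiving it, while Compress-Archive leaves the source file in place. A sketch of the Compress-Archive version with that cleanup step added (untested; assumes PowerShell 5.0 or later, where Compress-Archive is available, and note that Compress-Archive always produces zip format regardless of extension):

```powershell
# Zip each IIS log older than 7 days, then delete the original
# to mirror WinRAR's -df behavior.
$files = Get-ChildItem "D:\logfiles\IIS-Logs\*.log" -Recurse |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) }

foreach ($file in $files) {
    $zip = "$($file.FullName).zip"
    Compress-Archive -Path $file.FullName -DestinationPath $zip
    # Only remove the log if the archive was actually created
    if (Test-Path $zip) {
        Remove-Item $file.FullName
    }
}
```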