So I have a script that takes the filename of each song in a CSV list, checks a directory to see if the file exists, and exports whatever is missing. The CSV file looks something like this (the columns are the ones the script uses; the values here are just examples):
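ID,AlbumTitle,TrackNo,Filename
1,Some Album,01,track01.wav
2,Some Album,02,track02.wav
3,Another Album,01,track01a.wav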
Now, my script seems to work when I test it on a smaller directory, but when I run it against my actual directory on an external drive (about 10 TB of files), I get a System.OutOfMemoryException before the script can complete.
$myPath = 'Z:\Music\media'
$myCSV = 'C:\Users\Me\Documents\Test.csv'
$CSVexport = 'C:\Users\Me\Documents\Results.csv'

# Build a list of every unique .wav filename under the media path
$FileList = Get-ChildItem -Path $myPath -Filter *.wav -Recurse |
    Select-Object -ExpandProperty Name -Unique

# Keep only the CSV rows whose file was not found on disk
Import-Csv -Path $myCSV |
    Where-Object { $FileList -notcontains $_.Filename } |
    Select-Object ID, AlbumTitle, TrackNo, Filename |
    Export-Csv $CSVexport -NoTypeInformation

# Read the results back and report any missing filenames
$missing = Import-Csv $CSVexport | Select-Object -ExpandProperty Filename
if (![string]::IsNullOrEmpty($missing)) {
    Write-Output "Missing files:`n" $missing
}
Is there a way to make this script consume less memory, or a more efficient way to run it against a large directory of files? I am new to PowerShell scripting and am having trouble finding a way around this.
When @TheIncorrigible says to do it iteratively, he means something like this. Please note I am using different file paths, since I don't have a Z: drive. The idea is to load your CSV items into a variable, iterate through that variable with a foreach loop, test whether each item's file exists, and add the ones that don't to a new collection. Once the loop completes, export the collection of missing items to CSV.
$myPath = "C:\temp\"
$myCsv = "C:\temp\testcsv.csv"
$CSVexport = "C:\temp\results.csv"
$CsvItems = Import-Csv -Path $myCsv
$MissingItems
foreach($item in $CsvItems)
{
$DoesFileExist = Test-Path ($myPath + $item.Filename)
If($DoesFileExist -eq $false)
{
$MissingItems = $MissingItems + $item
}
}
$MissingItems | Export-Csv $CSVexport -NoTypeInformation
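If you want to trim memory use even further, the same check can run as a single streaming pipeline, so no collection of missing items ever builds up. Here is a sketch using the same placeholder paths as above:

# Each row flows straight from the input CSV to the results file,
# so only one row is in memory at a time
Import-Csv -Path $myCsv |
    Where-Object { -not (Test-Path (Join-Path $myPath $_.Filename)) } |
    Select-Object ID, AlbumTitle, TrackNo, Filename |
    Export-Csv $CSVexport -NoTypeInformation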