I've been using the following piece of code to get the disk usage starting at a specific directory:
$usage = Get-ChildItem $webRoot -Recurse | Measure-Object -Property Length -Sum
$diskUsage = $usage.Sum
The problem is that this takes a VERY long time compared with simply right-clicking a directory in Windows and looking at the Properties dialog.
Do we have access to the functionality that Explorer uses to get disk usage? Or is there another way that will be faster than the method I've been using?
I'm also not sure what will happen if there are circular links in the area my PS code is searching. I assume it could lock up.
Even when you right-click and view Properties, Explorer still has to calculate the size; that's what happens for me on my network drive. It takes a while because there are 1,000,000+ files and 500,000+ directories.
Using the command you supplied, it first gets all the files and folders and stores them in memory. Just doing $files = gci $path -r takes a long time because there are so many files. Then your command passes that big chunk of memory to Measure-Object, and finally Measure-Object does the calculation. So you're waiting for a while with no progress.
What you can do instead is use a ForEach-Object (%) loop to accumulate the size and output the running total as it goes. Printing text to the screen always slows things down, but at least you can see progress. Remove the cls; "Total: $($total/1GB)GB" part for more efficiency:
$total = 0  # reset the running total so a previous run doesn't skew the result
gci $path -File -r | % {$total += $_.Length; cls; "Total: $($total/1GB)GB"}
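Since the cls and the per-file write are themselves a big part of the cost, another option is to throttle the display instead of removing it entirely. A minimal sketch, assuming PowerShell 3.0+ for the -File switch ($path again stands in for your starting directory); it refreshes a Write-Progress bar only once every 1,000 files, so the output stays cheap:

$total = 0
$count = 0
gci $path -File -Recurse | % {
    $total += $_.Length
    $count++
    # Only touch the screen every 1,000 files; writing on every single
    # file is what makes the cls version slow.
    if ($count % 1000 -eq 0) {
        Write-Progress -Activity "Measuring disk usage" -Status ("{0:N2} GB in {1:N0} files" -f ($total / 1GB), $count)
    }
}
Write-Progress -Activity "Measuring disk usage" -Completed
"Total: $($total/1GB)GB"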
Run Measure-Command on your command and on the other commands given as answers, and see which one is fastest.
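A minimal sketch of that comparison, reusing $webRoot from the question (each Measure-Command block returns a TimeSpan, so compare the TotalSeconds values):

# Time the original one-shot pipeline from the question.
Measure-Command {
    (Get-ChildItem $webRoot -Recurse | Measure-Object -Property Length -Sum).Sum
}

# Time the streaming version, with the screen output stripped out so only
# the size calculation itself is measured.
Measure-Command {
    $total = 0
    Get-ChildItem $webRoot -File -Recurse | ForEach-Object { $total += $_.Length }
}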