I have a Perl subroutine that automatically minifies and caches JavaScript and CSS files. To get around browser caching issues, it prepends the current epoch time to the name of the cached file that is served to the browser, e.g. 1625006582-weather.css.
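For illustration, the naming scheme amounts to something like this (the directory and variable names are placeholders, not the actual subroutine):

my $cache_dir = '/path/to/cache';              # assumed cache location
my $cached    = time() . '-weather.css';       # e.g. 1625006582-weather.css
# write the minified CSS to "$cache_dir/$cached" and serve that name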
My initial plan was to delete any old copies using unlink </path/to/cache/*-weather.css> immediately before creating the new cache. However, that means there is a moment, ever so slight as it may be, when no copy is publicly available (longer, if the creation of the new version fails).
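In code, that initial plan is roughly the following sketch (the path is a placeholder):

unlink glob('/path/to/cache/*-weather.css');   # remove every old copy first
# ... regenerate and write the new timestamped file here ...
# if regeneration fails at this point, no copy is left to serve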
So what I would like to do instead is clean up later, in my maintenance scripts, by deleting every *-weather.css except the most recent copy, as efficiently as possible. I suppose I could iterate through the directory and unlink the files one at a time until I reached the last match, but I suspect there is a more efficient way to do it. Is there?
chdir($fullpath);
my @files = glob("*-weather.css");
# Delete everything except the last match returned by glob.
for (my $i = 0; $i < $#files; $i++) {
    unlink($files[$i]);
}
There's no guarantee that the newest file is the last one returned by glob, so sort the names explicitly before dropping the most recent copy:
use Sort::Key::Natural qw( natsort );
my @qfns = natsort glob("*-weather.css");
pop(@qfns); # We don't want to delete the newest.
unlink(@qfns);
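Dropped into a maintenance script, that might look like the following (the chdir, error handling, and directory name are my assumptions, not part of the answer):

use strict;
use warnings;
use Sort::Key::Natural qw( natsort );

my $fullpath = '/path/to/cache';               # assumed cache directory
chdir($fullpath) or die "Can't chdir to $fullpath: $!";

my @qfns = natsort glob("*-weather.css");
pop(@qfns);       # keep the newest copy
unlink(@qfns);    # delete the rest in a single call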