Tags: bash, delete-file, backup-strategies

How to make this bash script NOT delete all my files?


I have a cron job that backs up my MySQL database every 5 minutes to files ending in .sql.gz. That adds up to hundreds of files a day, so I searched the internet and found a bash script that I expected to work only on the .sql.gz files in the /backup folder I specified. Instead, I soon found that it deleted everything in my root folder. :-) I was able to FTP the files back and have my site back up in half an hour, but I still need the script to work as intended. I'm new to bash scripting, so I'm asking: what did I do wrong when editing the script I found on the internet for my needs? What would work?

Here is the rogue script. DO NOT run this as is. It's broken, that's why I'm here:

find /home/user/backups/*.gz * -mmin +60 -exec rm {} \;

I suspect that the last backslash should be /home/user/backups/, and also that I should remove the * before -mmin.

So what I need should be:

find /home/user/backups/*.gz -mmin +60 -exec rm {} /home/user/backups/; 

Am I correct, or am I still missing something? BTW, I'm running this from cron on Dreamhost shared hosting. Their support doesn't really want to help with bash questions; I tried.


Solution

  • The path arguments to find should be the directories where the recursive search starts. Then use -name and other tests to filter down to the files that match the criteria you want:

    find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -exec rm {} +
    

    Using + instead of \; at the end of -exec tells find to run the command once with all the selected filenames, rather than once per file; this is a minor efficiency improvement.
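
  • If you want to be sure the command matches only the files you expect before letting cron delete anything, a common pattern is to dry-run it with -print first and only then switch to deleting. This is a minimal sketch, assuming the same /home/user/backups path and 60-minute cutoff as above:

    # Dry run: list what would be removed, delete nothing
    find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -print

    # Once the list looks right, remove the same set of files.
    # -delete is not in POSIX find; if yours lacks it, keep -exec rm {} +
    find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -delete

    In the crontab, a cleanup job could look like this (the hourly schedule here is just an example, not something from the question):

    # hypothetical crontab entry: prune backups older than an hour, once an hour
    0 * * * * find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -exec rm {} +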