I want to use git-annex as part of a sync and backup solution. There are several devices connected to a single git-annex repository, and a few special remotes as backup solutions (s3, external drives).
Sometimes I no longer need a file or directory tree and want to drop it. Since the content is still available in the special remotes, I can safely do that. But it still consumes space there, which is annoying, especially for large files. I would therefore like to remove the content of all old files, i.e., those that were deleted more than, say, 4 weeks ago, from my s3 special remote.
How do I permanently delete the content of old removed files including all previous versions from a special remote?
I do not believe that you can do this with git-annex as it stands, based on my reading of the MATCHING OPTIONS section of the git-annex man page: none of the options consider age.
Therefore, I think you would have to write a script that uses git annex unused to list the unused keys, and then determines their age by searching the git log.
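A rough sketch of such a script is below. It assumes the special remote is named s3, uses a 4-week cutoff, and guesses at the "NUMBER KEY" format of the git annex unused output, so you will likely need to adjust the parsing and verify what it selects before letting it drop anything:

    #!/bin/sh
    # Sketch (untested): drop unused content older than ~4 weeks from the
    # special remote named "s3". The remote name and cutoff are assumptions.
    cutoff=$(date -d '4 weeks ago' +%s)   # GNU date; adjust on BSD/macOS

    # "git annex unused --from=s3" lists keys stored on s3 that no current
    # file uses. The awk expression picks out the numbered "NUMBER KEY"
    # lines; adjust it if your output differs.
    git annex unused --from=s3 \
        | awk '$1 ~ /^[0-9]+$/ && $2 ~ /^[A-Z]/ {print $1, $2}' \
        | while read -r num key; do
            # Timestamp of the last commit that referenced this key
            # (git-annex itself suggests "git log --stat -S'KEY'" for
            # finding where a key was previously used).
            last=$(git log -1 --format=%ct --all -S"$key")
            if [ -n "$last" ] && [ "$last" -lt "$cutoff" ]; then
                echo "dropping unused key $num ($key) from s3"
                git annex dropunused --from=s3 "$num"
            fi
        done

Note that dropunused still honours your numcopies settings, so it will refuse to remove content when not enough other copies exist; review what it would drop before reaching for --force.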