I'm using {targets} to manage a workflow, which is great.
We don't have a proper cluster setup, but I have access to remote machines with much better specs than my laptop, so I can use git to keep the plan in sync locally and remotely.
When I want to work with something locally, I use rsync to move the files over:
rsync -avxP -e "ssh -p ..." remote:path/to/_targets .
When I query the remote cache with tar_network(), I see that a bunch of my targets are "uptodate". When I query the local cache after the rsync above, those same targets are "outdated".
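For concreteness, this is roughly how I'm inspecting the status on each machine (a minimal sketch, assuming it is run from the project root next to the _targets/ store):

```r
library(targets)

# Build the graph data for the pipeline and look at each target's status
# ("uptodate", "outdated", ...).
net <- tar_network(targets_only = TRUE)
net$vertices[net$vertices$status == "outdated", ]

# tar_outdated() returns just the names of the outdated targets.
tar_outdated()
```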
I'm wondering whether there are better calls to rsync or particular arguments to tar_network() I should be using, or whether this is a bug and the targets should stay "uptodate" after an rsync like this?
OK, so I figured this out.
It was because I was being foolish about what I made a target in this case. To make sure that a package dependency was being captured, I was using something that grabbed the entire DESCRIPTION of the package (packageDescription(), I think). The problem with that is that when you install the package using remotes::install_github(), it adds some extra information to the DESCRIPTION upon installation (the Packaged and Built fields), and that information differed between the installation on my local machine and the installation on the remote machine.
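The problematic setup looked roughly like this (a sketch only; the package name, the target names, and run_analysis() are made up for illustration):

```r
# _targets.R (sketch): the entire DESCRIPTION is a dependency, so the
# install-time Packaged/Built fields make its value differ between machines.
library(targets)

run_analysis <- function(desc) {
  # stand-in for the real downstream work that depends on the package
  desc$Version
}

list(
  tar_target(pkg_desc, utils::packageDescription("mypackage")),
  tar_target(results, run_analysis(pkg_desc))
)
```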
What I really wanted was just the GithubSHA1 bit from packageDescription(), to verify that I was using the same package at the same commit from my GitHub repo. Once I switched to using that instead of the entire DESCRIPTION, targets had no issues with the meta information and things stayed current when rsync'ing between machines.
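The fix amounted to something like this (again, "mypackage" and the target name are placeholders):

```r
# Track only the commit SHA that remotes::install_github() records in the
# DESCRIPTION; it is identical on every machine installed from the same commit.
tar_target(
  pkg_sha,
  utils::packageDescription("mypackage")$GithubSHA1
)
```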