I want to execute a ClearML task remotely. According to the docs there are two options: 1) execute a single Python file; 2) ClearML detects that the script is part of a repo, and that repo is cloned into the docker container, installed, and executed on the worker.
In this second scenario it is assumed that the repo has a remote URL and that it is accessible by the worker. What if that isn't the case? Is it possible to somehow pack the local repo and send it for remote execution?
I think this would be somewhat of an extension of scenario 1, where instead of a single file, a whole directory with files in it is passed for execution.
PS: I understand the reproducibility concerns that arise, but the repo is really not accessible from the worker :(
Thanks in advance.
Disclaimer: I'm a member of the ClearML team
In this second scenario it is assumed that the repo has a remote URL and that it is accessible by the worker. What if that isn't the case? Is it possible to somehow pack the local repo and send it for remote execution?
Well, no :( If your code is a single script, then yes, ClearML stores the entire script and the worker reproduces it on the remote machine. But if your code base is composed of more than a single file, why not use git? Free hosting is available from GitHub, Bitbucket, GitLab, etc.
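Just to illustrate the single-script case, a minimal sketch (the project, task, and queue names below are placeholders, not anything from your setup):

```python
from clearml import Task

# Single-script case: ClearML captures this file's full source with the task,
# so no git repo is needed for the worker to reproduce it.
task = Task.init(project_name="examples", task_name="single script run")

# Stop local execution and enqueue the task; an agent picks it up and
# re-runs the stored script remotely ("default" is an assumed queue name).
task.execute_remotely(queue_name="default")
```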
In theory this is doable, and if you feel the need, I urge you to open a PR for this feature. Basically you would store the entire folder as an artifact (ClearML will auto-zip it for you), and then the agent would need to unzip the artifact and run it. The main issue is that cloning the Task will not clone the artifact...
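As a rough, untested sketch of that idea done manually (folder path, artifact name, project/task names are all made up for illustration): pack the folder as an artifact on the local machine, and have the entry script on the worker pull it back down before running anything.

```python
from clearml import Task

# --- locally: pack the folder into the task as an artifact ---
task = Task.init(project_name="examples", task_name="packed repo run")
# Passing a directory path makes ClearML zip the folder automatically;
# "source_code" and "./my_local_repo" are just placeholder names.
task.upload_artifact(name="source_code", artifact_object="./my_local_repo")

# --- on the worker: the entry script would fetch and unpack it ---
running_task = Task.current_task()
# get_local_copy() downloads the artifact and extracts the zip to a local folder
source_dir = running_task.artifacts["source_code"].get_local_copy()
# from here the script could add source_dir to sys.path and launch the real code
```

The caveat from above still applies: because the artifact lives on the original Task, cloning that Task would not carry the packed folder along with it.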