I'm hosting a git repo on a shared host. My repo necessarily has a couple of very large files in it, and every time I try to run "git gc" on the repo now, my process gets killed by the shared hosting provider for using too much memory. Is there a way to limit the amount of memory that git gc can consume? My hope would be that it can trade memory usage for speed and just take a little longer to do its work.
Yes, have a look at the help page for `git config` and look at the `pack.*` options, specifically `pack.depth`, `pack.window`, `pack.windowMemory` and `pack.deltaCacheSize`.
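
For example, here's a minimal sketch of conservative per-repository settings; the values are illustrative assumptions, not recommendations, so tune them to your host's actual memory limit:

```
# Reduce how many objects git compares at once when searching for deltas.
git config pack.window 5        # default is 10
git config pack.depth 10        # default is 50

# Cap the memory used for the delta search window.
git config pack.windowMemory "32m"

# Cap the in-memory cache of computed deltas while packing.
git config pack.deltaCacheSize "16m"
```

Smaller windows and shallower delta chains mean git holds fewer objects in memory at a time, at the cost of somewhat larger packs and a longer repack, which is exactly the memory-for-speed trade-off you're after.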
Note that these settings won't give you an exact memory cap: git still needs to map each object into memory, so a single very large object can cause high memory usage regardless of the window and delta cache settings.
You may have better luck packing locally and transferring the pack files to the remote side "manually", then adding `.keep` files so that the remote git never tries to completely repack everything.
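
A rough sketch of that workflow, where `pack-<hash>`, the paths, and `user@host` are placeholders for your actual pack file name and remote layout:

```
# Locally: repack everything into a single pack file.
git repack -a -d

# Copy the pack and its index into the remote repository's object store
# (path shown assumes a bare repo; adjust to your setup).
scp .git/objects/pack/pack-<hash>.pack \
    .git/objects/pack/pack-<hash>.idx \
    user@host:repo.git/objects/pack/

# On the remote: a .keep file next to the pack tells gc/repack to leave it alone.
ssh user@host 'touch repo.git/objects/pack/pack-<hash>.keep'
```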