Build systems like Make or Ninja should already track dependencies and recompile only the parts affected by a change. So why does ccache help?
I understand that ccache and build systems work in different ways: ccache looks at file contents, while build systems only look at file timestamps. So if you change a file and then change it back, ccache helps, sure, but I assume that is not the main use case? Similarly, it helps after make clean && make.
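The distinction can be sketched with a toy model (hypothetical helper names; real tools are far more involved): a Make-style check compares timestamps, while a ccache-style check derives a cache key from the file content and the compiler flags.

```python
import hashlib
import os
import tempfile
import time

def make_style_stale(src: str, obj: str) -> bool:
    """Make/Ninja-style check: rebuild if the source is newer than the output."""
    return (not os.path.exists(obj)) or os.path.getmtime(src) > os.path.getmtime(obj)

def ccache_style_key(src: str, flags: str) -> str:
    """ccache-style check (simplified): key derived from content + flags."""
    with open(src, "rb") as f:
        return hashlib.sha256(flags.encode() + f.read()).hexdigest()

# A touched-but-unchanged file fools the timestamp check, not the content hash.
with tempfile.TemporaryDirectory() as d:
    src, obj = os.path.join(d, "a.c"), os.path.join(d, "a.o")
    open(src, "w").write("int main(void){return 0;}")
    open(obj, "w").write("fake object")                 # pretend we compiled
    key_before = ccache_style_key(src, "-O2")
    time.sleep(0.01)
    os.utime(src)                                       # bump mtime, same content
    print(make_style_stale(src, obj))                   # True: Make would rebuild
    print(ccache_style_key(src, "-O2") == key_before)   # True: ccache hit
```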
But is there a case where ccache really solves a problem that build systems can't handle? For example, is the same file genuinely compiled many times in a project? Or is there a case where build systems cannot resolve the dependencies and decide to re-compile every time? Real-world use cases are preferred. Thanks in advance :)
But is there a case that ccache really solves the problem that build systems can't handle?
Ccache (by default) uses a global cache: the same file compiled in two different workspaces (assuming the same output, e.g. identical compiler flags) is stored in and retrieved from the same cache. Neither Make nor Ninja can do that.
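As a sketch (a toy model, not ccache's real on-disk format): two workspaces compiling identical content with identical flags map to the same cache key, so the second "compilation" is a cache hit even though the build systems in the two workspaces know nothing about each other.

```python
import hashlib

class GlobalCache:
    """Toy content-addressed cache shared by all workspaces on a machine."""
    def __init__(self):
        self.store = {}
        self.hits = 0
        self.misses = 0

    def compile(self, source: str, flags: str) -> bytes:
        # Key depends only on content + flags, not on file path or workspace.
        key = hashlib.sha256((flags + "\0" + source).encode()).hexdigest()
        if key in self.store:
            self.hits += 1
            return self.store[key]
        self.misses += 1
        obj = b"OBJ:" + key.encode()[:8]   # stand-in for real compiler output
        self.store[key] = obj
        return obj

cache = GlobalCache()                      # one cache for the whole machine
src = "int main(void){return 0;}"
cache.compile(src, "-O2")                  # workspace A: miss, compiles
cache.compile(src, "-O2")                  # workspace B: hit, no compile
print(cache.hits, cache.misses)            # → 1 1
```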
Though probably irrelevant for most users, the cache size can be limited (default 5 GB), with the least recently used entries evicted first. Combined with having the build system remove intermediate files (e.g. object files used to build a library or executable), you can end up with a smaller disk space footprint at almost equal performance.
Ccache can also compress the cached files, reducing the disk footprint further.
Finally, ccache can have its cache located remotely: https://ccache.dev/manual/latest.html#_remote_storage_backends
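All three of these (size limit, compression, remote backend) are set in ccache.conf. A sketch, with illustrative values and a hypothetical URL; check `ccache --show-config` and the manual for the options your version supports:

```
# ~/.config/ccache/ccache.conf (path varies by platform and version)
max_size = 5G            # evict least-recently-used entries beyond this
compression = true       # compress cached objects to shrink the footprint
remote_storage = http://ccache.example.com/cache   # illustrative remote backend
```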
Make and Ninja are (in general) fooled by timestamp changes, such as those caused by checking out different branches of development. Ccache is not.
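You can see the timestamp problem without ccache at all. In this sketch, `cp` stands in for the compiler so no toolchain is needed; `touch` simulates what a branch switch does to mtimes:

```shell
mkdir -p /tmp/ts-demo && cd /tmp/ts-demo
printf 'int main(void){return 0;}\n' > main.c
printf 'main.o: main.c\n\tcp main.c main.o\n' > Makefile
make            # first build: runs the "compile" step
make            # nothing to do: main.o is up to date
touch main.c    # identical content, newer mtime
make            # rebuilds anyway -- exactly where ccache would serve a hit
```

With ccache wrapping a real compiler, that last recompile would hash the preprocessed source, find the previous result, and return it without invoking the compiler proper.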
Note: you can't truly generalize about "build systems" when it comes to ccache. Some have their own built-in caching, e.g. Bazel; adding ccache on top of those would be redundant.