I'm working on a Go project and experimenting with organizing code into packages. Specifically, I’m wondering:
Does the number of Go packages in a project affect the runtime speed or resource consumption of the compiled binary?

I’ve already observed that a larger number of packages increases build time significantly, but I want to know whether it affects the speed or memory usage of the program while it is running.
Short answer: No
Long answer:

The Go compiler does all of the necessary code optimization at build time. So how does it handle the scenario where an application contains many packages?

The reason the build takes longer is that all of the per-package work (compiling each package, writing and reading export data, optimizing) happens at compile time. From the Go docs:
Deep export data is simpler for build systems, since only one file is needed per direct dependency.
However, it does have a tendency to grow as one gets higher up the import graph of a big repository: if there is a set of very commonly used types with a large API, nearly every package’s export data will include a copy. This problem motivated the “indexed” design, which allowed partial loading on demand. (gopls does less work than the compiler for each import and is thus more sensitive to export data overheads. For this reason, it uses “shallow” export data, in which indirect information is not recorded at all. This demands random access to the export data files of all dependencies, so is not suitable for distributed build systems.)
That is why having many packages is not a problem at runtime. Compilation will take longer, but package boundaries exist only at compile time: the linker merges everything into a single statically linked binary, so the running program's speed and memory usage are unaffected by how the code was divided into packages.