protocol-buffers grpc

gRPC memory usage during stream


According to this answer: stream reduces the maximum amount of memory required to process each message

I wonder: when sending a large file by splitting it into chunks and sending them through a gRPC stream, do all the chunks need to be held in memory until the whole process is done?

If a file of size X MB needs to be passed through multiple services (meaning it is sent and received multiple times along the way), does that theoretically mean each service will need to have 2X MB of memory available?


Solution

  • do all the chunks need to be held in memory until the whole process is done?

    No; the stream can even be theoretically infinite. All that gRPC keeps rooted is the current message; that is basically the point of streaming. Of course, if your own code roots every chunk (for example, by adding each one to a list/array), then memory usage will grow with the file size anyway.
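
    To make that concrete, here is a minimal Go sketch of the chunked-upload pattern, assuming a hypothetical FileService with a client-streaming Upload RPC, a Chunk message carrying a bytes data field, and a pb import standing in for the generated stubs. Each chunk is read, sent, and then dropped, so only the chunk currently in flight is referenced on the client side.

    // Hypothetical proto definition assumed by this sketch:
    //
    //   service FileService { rpc Upload(stream Chunk) returns (UploadStatus); }
    //   message Chunk { bytes data = 1; }
    package main

    import (
    	"context"
    	"io"
    	"log"
    	"os"

    	"google.golang.org/grpc"
    	"google.golang.org/grpc/credentials/insecure"

    	pb "example.com/fileservice" // hypothetical generated package
    )

    const chunkSize = 64 * 1024 // 64 KiB per message

    func uploadFile(ctx context.Context, client pb.FileServiceClient, path string) error {
    	f, err := os.Open(path)
    	if err != nil {
    		return err
    	}
    	defer f.Close()

    	stream, err := client.Upload(ctx)
    	if err != nil {
    		return err
    	}

    	for {
    		// A fresh buffer per chunk: once Send returns and the next
    		// iteration starts, nothing in this code still references the
    		// previous chunk, so it becomes eligible for garbage collection.
    		buf := make([]byte, chunkSize)
    		n, err := f.Read(buf)
    		if n > 0 {
    			if sendErr := stream.Send(&pb.Chunk{Data: buf[:n]}); sendErr != nil {
    				return sendErr
    			}
    		}
    		if err == io.EOF {
    			break
    		}
    		if err != nil {
    			return err
    		}
    	}

    	_, err = stream.CloseAndRecv()
    	return err
    }

    func main() {
    	conn, err := grpc.NewClient("localhost:50051",
    		grpc.WithTransportCredentials(insecure.NewCredentials()))
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer conn.Close()

    	if err := uploadFile(context.Background(), pb.NewFileServiceClient(conn), "large-file.bin"); err != nil {
    		log.Fatal(err)
    	}
    }

    The receiving side mirrors this: it calls Recv in a loop and writes (or forwards) each chunk as it arrives, so a relay service only needs memory for the chunks currently passing through, not for the whole X MB file.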