I'm trying to generate a reasonably sized core dump of a running .NET Core process using gcore, but the resulting file is larger than 20 GB.
The process is dotnet wapi.dll, which is the binary of an empty project created using dotnet new webapi.
I think the size of the dump is related to the amount of virtual memory the process uses.
The main question is how can I generate a smaller core dump?
Is this related to what I'm thinking of (virtual memory)?
Should I limit the virtual memory? If so, how?
I found that the easiest way to do this is to use the createdump utility, which ships with the .NET Core runtime and is located in the same directory as libcoreclr.so (thanks to Maoni Stephens).
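For example, you can locate the utility next to libcoreclr.so on a typical Linux install; the search path below is only an assumption and depends on your runtime version and install location:

find /usr/share/dotnet -name createdump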
Using createdump is pretty easy:
createdump [options] pid
-f, --name - dump path and file name. The pid can be placed in the name with %d. The default is "/tmp/coredump.%d"
-n, --normal - create minidump (default).
-h, --withheap - create minidump with heap.
-t, --triage - create triage minidump.
-u, --full - create full core dump.
-d, --diag - enable diagnostic messages.
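For instance, a heap-only minidump (usually far smaller than a full dump) could be created roughly like this, using the options listed above; <pid> is a placeholder for the target process id, and you may need to run it as root or with ptrace permissions:

createdump --withheap --name /tmp/coredump.%d <pid>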
Read more about createdump here.
Another option is to use the dotnet-dump global tool, which you can read about here.
On Linux, the runtime version must be 3.0 or greater. On Windows, dotnet-dump collect will work with any version of the runtime.
Because I was running v2.2, I was unable to use this tool.
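If you are on runtime 3.0 or later, a typical invocation would look something like the sketch below; <pid> is a placeholder, and the Heap dump type is the tool's option for a minidump that includes the GC heap:

dotnet tool install --global dotnet-dump
dotnet-dump collect -p <pid> --type Heap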