I am using std::bitset and I am trying to create two std::bitset arrays, each of size 100,000,000,000. As a result, the program fills only 298 MB of my RAM, but it should fill ~24 GB. I have 32 GB RAM and 26 GB are currently free. When I build my code for x86, it compiles and can start, but for x64 it shows me the following: error C2148: total size of array must not exceed 0x7fffffff bytes. How can I fix this without reducing the size of the bitset arrays?
I have tried making the two arrays global. I have also set, in Microsoft Visual Studio -> project -> name_project properties -> configuration properties -> linker -> system -> Stack Reserve Size, a value of 25,000,000 (I think the unit is KB, so I think I have set ~25 GB).
... // other libraries
#include <bitset>
std::bitset<100000000000> mas;
std::bitset<100000000000> a1;
int main() { ... /* work with the arrays */ ... }
I want to run the code with the huge std::bitset arrays.
UPD: x86 is OK, but what about x64? My code checks the whole arrays, and at one point it stops.
std::bitset must, at a minimum (and most implementations use the minimum), use one byte per eight bits to be stored. For 100 billion bits, that means you need ~12.5 GB of memory per bitset. The problem is, on a 32 bit system, your maximum virtual memory size for the whole program is at most 4 GB. Some of that is eaten by the kernel memory reservation for the process, so odds are you only have 2 GB of virtual address space to use; you're trying to use six times that much.

Your program cannot run on a 32 bit system without shrinking the bitset. If it claims to run, it's likely a different error; it would be truncating 100_000_000_000 to fit in a 32 bit size_t, creating a std::bitset<1215752192> instead, which would only require ~150 MB of memory, causing no problems. That would explain your observed memory usage of 298 MB; the memory usage display uses mebibytes (base-2, not base-10, so KiB == 1024 and MiB == 1048576), which makes each array consume just under 145 MiB, 290 MiB total for the two of them, leaving 8 MiB for the rest of your program (which seems reasonable).
If it is in fact dying on x64 with that error, you're stuck; whoever implemented your std::bitset (or whatever data structure backs it, e.g. std::array) limited it to 0x7fffffff bytes even on a 64 bit system, which would limit you to std::bitsets of around 17 billion bits or less. Your only option would be to find a different provider for your standard library, or reimplement it yourself.
Update: Apparently you're using Windows, and the limit on static data size is 2 GB (0x7fffffff bytes), even on 64 bit Windows; the Windows Portable Executable file format (used by .exe and .dll files) uses 32 bit offsets and lengths for each section, even for 64 bit executables. By using global std::bitsets, you're trying to store 25 GB of static data in the image, which won't work. Moving them to the stack (declaring them non-static inside the body of main) might work if you increase the Stack Reserve Size like you did, but it's still a bad idea to rely on a stack that large. I'd suggest simply dynamically allocating the bitset (e.g. auto mas = std::make_unique<std::bitset<100000000000>>()), or finding a better way to do this with a smaller bitset.