Why is there a segmentation fault? I want to find all the factors of a number and put them in a vector. I have another function that does exactly the same thing using a while loop, so I thought I would try recursion. "i" starts at 1 unless I pass some other value from main.cpp. The "cout i" line is only there so I can see where it fails.
void recurfact ( std::vector <int> & facts, int numb, int i )
{
    std::cout << i << std::endl;   // trace output: shows how far we get before the crash
    if ( i > numb )
    {
        return;
    }
    if ( numb % i == 0 )
    {
        facts.push_back(i);
        i = i + 1;
        recurfact ( facts, numb, i );
    }
    else
    {
        i = i + 1;
        recurfact ( facts, numb, i );
    }
}
This works if I test it with numbers up to about 42800 (give or take 100). For anything larger it just stops, and the debugger reports a segmentation fault. If I comment out the push_back line it still crashes at the same value of i.
However, if I start with i = 45000, I can test numbers from 45000 up to 85000 without a problem. Above 85000 it crashes.
I'd like to know why this occurs.
Compiling with gcc under Cygwin on Windows 7.
Error message from gdb is:
Program received signal SIGSEGV, Segmentation fault. 0x000007fefcec10d6 in WaitForSingleObjectEx () from /cygdrive/c/Windows/system32/KERNELBASE.dll
See this document about segmentation faults on Unix-type systems:
http://www.cs.nyu.edu/exact/core/doc/stackOverflow.txt
Under Unix-like systems, programs may throw a "Segmentation Fault" error. This can be due to stack overflow, especially from recursive function calls or huge data sets. In our demo program "Pi" (see "$(CORE_PATH)/progs/pi"), we compute Pi to any number of desired bits or digits. Here are some test results on when stack overflows will occur on different platforms, using their default stack sizes.