I've been working on a bunch of image processing programs, nothing fancy, mostly quick-and-dirty experiments. The image data is stored in vectors that are declared on the stack (I try to avoid using pointers when I don't need to pass data around). I've noticed that some of my functions behave very strangely despite countless hours of debugging and stepping through them. Sometimes the debugger gives me an error that it cannot evaluate a certain variable, among other things. Things generally just don't make sense, and past experience tells me that when this happens there is some kind of overflow or memory corruption going on. The first thing that came to mind was that it was probably due to me storing lots of image data in vectors.
However, I was under the impression that vectors store their actual data on the heap, so I thought it wouldn't hurt to have a few of these large vectors on the stack. Am I wrong in thinking this? Should I be allocating the vectors themselves on the heap rather than the stack?
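Here's a minimal sketch of the pattern I mean (the function name and image dimensions are made up for illustration):

    #include <vector>
    #include <cstdint>

    // Hypothetical example of what I'm doing: the vector object itself
    // lives on the stack, but I assumed its elements live on the heap.
    void process_image()
    {
        const std::size_t width  = 4096;
        const std::size_t height = 4096;

        // One byte per pixel, grayscale: 16 MB of pixel data.
        std::vector<std::uint8_t> pixels(width * height, 0);

        // ... quick-and-dirty processing on pixels ...
    }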
Thanks,
[...] vectors store their actual data on the heap
vector, like all other containers, uses an allocator object for memory management. If you don't specify anything as the template's second parameter, the default allocator, std::allocator from <memory>, is used. It is the allocator's responsibility to reserve memory, and it is free to do so either from the free store or, with a custom allocator, from somewhere else entirely, such as a buffer on the stack.
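For illustration, here's a minimal sketch of a custom allocator plugged in as that second template parameter (the MallocAllocator name is made up; it simply forwards to malloc/free):

    #include <memory>
    #include <vector>
    #include <cstdlib>
    #include <new>

    // Minimal custom allocator, just to show the second template
    // parameter in use. Not something you'd need for ordinary code.
    template <typename T>
    struct MallocAllocator {
        using value_type = T;

        MallocAllocator() = default;
        template <typename U>
        MallocAllocator(const MallocAllocator<U>&) {}

        T* allocate(std::size_t n) {
            if (void* p = std::malloc(n * sizeof(T)))
                return static_cast<T*>(p);
            throw std::bad_alloc();
        }
        void deallocate(T* p, std::size_t) { std::free(p); }
    };

    template <typename T, typename U>
    bool operator==(const MallocAllocator<T>&, const MallocAllocator<U>&) { return true; }
    template <typename T, typename U>
    bool operator!=(const MallocAllocator<T>&, const MallocAllocator<U>&) { return false; }

    int main()
    {
        // Same element interface; only the memory source differs.
        std::vector<int> v1 = {1, 2, 3};                       // default: std::allocator<int>
        std::vector<int, MallocAllocator<int>> v2 = {1, 2, 3}; // custom memory source
        v2.push_back(4);
    }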
Most implementations store a few pointers within the vector object itself (in the spirit of the pimpl idiom) that point to the actual element storage on the free store.
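You can see this from the footprint of the object itself; a quick sketch (the exact sizeof value is implementation-specific, commonly three pointers):

    #include <iostream>
    #include <vector>

    int main()
    {
        std::vector<double> v(1000000);  // one million doubles on the free store

        // The vector object on the stack stays small (typically 3 pointers),
        // no matter how many elements it manages.
        std::cout << "sizeof(v):        " << sizeof(v) << " bytes\n";
        std::cout << "managed elements: " << v.size() * sizeof(double) << " bytes\n";
        std::cout << "data() address:   " << static_cast<void*>(v.data()) << '\n';
    }

So a vector declared on the stack costs only a handful of bytes of stack space; the image data itself lives on the free store.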
I've noticed that some of my functions behave very strangely despite countless hours of debugging and stepping through them
You may want to check that you are using your vectors properly. Look up the standard for the guarantees each member function gives, the requirements the contained types must satisfy, and when your iterators get invalidated. That should be a good start.