I've just read this:
http://www.artima.com/lejava/articles/azul_pauseless_gc.html
Although I have some experience with compilers, I've done nothing related to garbage collection; it's a big black box to me.
I've struggled to understand the issues mentioned in the article. I understand the problem (there's a pause when executing most garbage collectors), and I understand that they claim their implementation doesn't have that problem. But I don't understand why/how the problem happens in the first place (that much seems to be assumed in the original text), and consequently I don't see why their solution would work.
Can someone explain why/how that pause happens in the first place, and how Azul's approach avoids it?
I tend to understand this kind of thing better when explained graphically - a small memory diagram drawn in a text editor would probably suffice.
They talk about the pause that inevitably occurs when compacting the heap. You see, when you allocate and deallocate lots of objects of different sizes as you go, you fragment the heap (much like you fragment your hard drive). When fragmentation becomes too extreme, you have to clean up/defragment/compact the heap by reserving a large chunk of memory, moving all objects there (packed together, without any fragmentation), and then using their former locations as a fresh, fragmentation-free chunk of memory.
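To make that concrete, here is a toy sketch (the array standing in for the heap is made up purely for illustration; a real GC works on raw memory, not a `String[]`). It shows live objects being slid together into one contiguous region, which is exactly what creates the problem described next:

```java
import java.util.Arrays;

// A toy "heap" of fixed-size slots, just to visualize fragmentation and compaction.
public class ToyHeap {
    // null = free slot, non-null = a live object occupying that slot
    static String[] heap = {"A", null, "B", null, null, "C", null, "D"};

    public static void main(String[] args) {
        // Fragmented: 4 slots are free in total, but no 3 of them are contiguous,
        // so a 3-slot allocation would fail despite there being enough free space.
        System.out.println(Arrays.toString(heap));

        // Compaction: slide every live object toward index 0...
        int next = 0;
        for (int i = 0; i < heap.length; i++) {
            if (heap[i] != null) {
                heap[next++] = heap[i];
            }
        }
        // ...and clear the rest, leaving one contiguous free region.
        for (int i = next; i < heap.length; i++) {
            heap[i] = null;
        }
        System.out.println(Arrays.toString(heap));
        // Problem: anything that remembered "C lives at index 5" is now stale.
    }
}
```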
When you do that, you invalidate all references to the objects you moved. To cope with this, you must ensure that no reference pointing to a pre-compaction object location is ever used. By far the easiest way to do so is to pause the whole application, move the objects around, and then go and update all references. Of course this can incur a significant overhead.
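A minimal sketch of that stop-the-world approach, assuming a hypothetical forwarding table from old object locations to new ones (the `Ref` type and `copyToNewLocation` are made up for illustration; a real collector works on raw heap addresses):

```java
import java.util.IdentityHashMap;
import java.util.List;
import java.util.Map;

// Sketch: while the application is paused, move every live object and then
// rewrite every reference in the program using a forwarding table.
class StopTheWorldCompactor {
    static class Ref { Object target; }   // a mutable reference slot somewhere in the program

    static void compact(List<Object> liveObjects, List<Ref> allReferences) {
        Map<Object, Object> forwarding = new IdentityHashMap<>();

        // 1. Move (here: just "copy") every live object and remember where it went.
        for (Object oldObj : liveObjects) {
            forwarding.put(oldObj, copyToNewLocation(oldObj));
        }

        // 2. Walk *every* reference and point it at the new copy.
        //    This full sweep is why the whole application has to stay paused.
        for (Ref ref : allReferences) {
            Object newObj = forwarding.get(ref.target);
            if (newObj != null) {
                ref.target = newObj;
            }
        }
    }

    static Object copyToNewLocation(Object o) {
        return o; // placeholder: a real GC would copy the bytes to the new heap region
    }
}
```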
So the solution Azul proposes goes like this: they establish a "read barrier" that lets the GC intercept every dereference, and this way they can lazily fix up the references that are actually used, instead of pausing everything to update them all at once.
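Conceptually it looks something like the sketch below (this is not Azul's actual mechanism, just an illustration of the read-barrier idea using a hypothetical per-object forwarding pointer): every load of a reference runs through the barrier, which checks whether the object has been moved and, if so, repairs the reference on the spot, so a stale reference never reaches the application code.

```java
import java.util.concurrent.atomic.AtomicReference;

// Rough sketch of a "self-healing" read barrier (names are invented for illustration).
class ReadBarrierSketch {
    static class ObjHeader {
        volatile ObjHeader forwardedTo;   // set by the GC when it moves the object
        Object payload;
    }

    // Every reference load in the program would go through something like this.
    static ObjHeader readBarrier(AtomicReference<ObjHeader> slot) {
        ObjHeader obj = slot.get();
        if (obj != null && obj.forwardedTo != null) {
            // The object was moved since this reference was written:
            // follow the forwarding pointer and update the slot lazily.
            obj = obj.forwardedTo;
            slot.set(obj);   // "heal" the reference so the next read pays no extra cost
        }
        return obj;
    }
}
```

The design point is that only references the application actually touches get updated, and each one only once, so there is no single moment where the whole heap's worth of references has to be fixed while everything is stopped.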