>>18
Reference counting does this within context, and at the right moment. GC does not: it saves work up until the most inconvenient moment, then starts guessing what should be freed, burning time that is mostly spent deciding whether freeing is even the right thing to do.
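A minimal sketch of that claim, assuming CPython (which uses reference counting; other Python implementations may defer this to a tracing GC): the object is freed the instant its last reference goes away, in context, at a predictable moment.

```python
events = []

class Resource:
    def __del__(self):
        # Runs immediately when the refcount hits zero (CPython behavior).
        events.append("freed")

def use_resource():
    r = Resource()
    events.append("using")
    # Returning drops the last reference to r, so __del__ fires here,
    # before control gets back to the caller.

use_resource()
events.append("after call")
print(events)  # under CPython: ['using', 'freed', 'after call']
```

The point is the ordering: "freed" lands between "using" and "after call", deterministically, with no collector deciding later whether to reclaim it.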
GC makes an application behave unpredictably, makes debugging in this area unreproducible, and introduces CPU load spikes.
I believe GC is nothing more than an experimental computer science research field that had some friends in high places, who let it out into the wild and shoved it down our throats.