I think we can all agree that C++ is a terrible language. So why is it still around?
When talking to C++ users (game developers, systems programmers), I've found that most seem to recognize C++'s faults, but they don't really care. They aren't even slightly interested in a new language that might solve those problems, even one that promised all the power of C++ with none of the downsides. You can't even get them to look at something new.
Why is that? Why does everyone just 'live with it' without wanting to improve the situation?
Mostly I just want to believe that it's possible to do it without sacrificing anything.
I prefer to deal with this sort of thing myself, so I think there is a sacrifice. Putting the invariant in the type gets you part way, but that quickly falls apart in (say) C's type system.
The point is that you do not need to significantly change code in order to remove the garbage collector; it is not as though you are rewriting the program, just annotating it.
The way you put it originally made me think you meant the compiler should refactor the program with calls to memory management facilities inserted instead of giving responsibility to the programmer.
My only argument is that some allocations would require refactoring to deal with outside of GC: a collector can handle non-deterministic allocations and deallocations whose lifetimes outlive their creation context. Anything that can be solved simply by annotating could probably be handled by the compiler, couldn't it?
The idea is that 99% of the time, you would need to do the check for nil anyway.
I don't buy that. Nullable pointers show up outside their initialization context probably at least 50% of the time. So far so good, but it's not true that every time one is received or passed out of context it is in danger of being null. A good deal of my code checks these once and never has to worry about it again. A lot of my C never checks at all (because there is no non-nullable type, and in those instances a failure would have precluded creation in the first place). Without analysis a compiler couldn't know when this would be okay.
I would go the other way and say that 99% of the automatically inserted checks, barring this kind of optimization, are provably unnecessary. The real figure is probably much better, but I doubt the real-world optimized case approaches parity with the manually checked case (limited to the cases where the manual checking is indeed correct.)
Somewhere there is a paper about all of this, with statistics. The paper will say that the analysis is possible up to (or perhaps beyond) the point of program complexity that a human can be expected to deal with, according to some metric or estimate. Whether that kind of analyzer exists in any implementation is perhaps another story.
Doesn't seem that tall. What are the holes in my reasoning?
I don't fully understand your reasoning, but it sounds like you're pushing a static solution to memory leaks in the face of dynamic allocation. I'm probably mistaken here, but if that is what you mean, then I'd say it's impossible.