>>13
Over the past few years, I have noticed that beyond K&R's spec, the rationale for features in the various C standards has tended to kowtow to `existing code' in some regard, rather than to what should be right. For example, I was just reading in c.l.c. that somewhere along the line the behavior of malloc(0) was settled (you can look it up, but the gist is `implementation-defined: either a null pointer, or a unique pointer that compares unequal to every other live pointer but cannot be dereferenced'), and that the reason given was to support people implementing dynamic arrays who start with 0 as the array length.
This isn't a huge problem I have with the language. It's in a corner very close to undefined behavior, and the behavior isn't too bad, but I see it as a flaw in C99 that previous versions of C didn't have.
I haven't had the time to sit down and fully read the rationale for C11 (or, honestly, the rationale for any revision), so I'm not sure how true this next point is, but I've heard that many of the changes in recent versions of C are motivated simply by the desire to introduce syntactic and semantic differences between C and C++, so that people will stop saying C is a subset of C++. Again, I believe this leads to technically inferior decisions.
I've tended to use C89 as my default compiler argument over the past few years, and while there are plenty of things I dislike about it (gets() exists, which C11 finally removed; no anonymous unions; no stdint.h unless I bring my own), I've never found a compelling reason to actually change: no argument that a later standard lets the compiler make my code faster, no safety argument that can't be reduced to `think about your inputs', etc. If I'm going to stop writing -ansi in my makefiles just because something else is newer and has more features, why don't I just switch to Sepples while I'm at it, so that I can get lambdas and better pointers and over-engineered templates?