>>20
You're a troll because you knew from the very beginning the purpose of the example and the purpose of the thread in general, yet you've insisted on being picky about issues which are ludicrously obvious to most of us, and which were evidently discarded as not pertaining to the discussion.
Thus, either a troll or a complete illiterate retard. I most respectfully prefer to judge you as the former.
>>24
What. Excuse me? Non-deterministic behavior solved by the C preprocessor? What the fuck are you talking about?!
>>22
TL;DR: Don't over-interpret standards.
Standard-wise, compilers can generate different code not only in the case of undefined behavior, but also when compiling compliant code, provided that the result matches the observable behavior of the source code. Theoretically, a compiler could insert any number of useless operations anywhere in the code, as long as those operations don't change the output of the program. Even more extreme: the compiler could refuse to compile a piece of compliant code, simply because the C standard does not say it must correctly compile every piece of compliant code!
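To make that concrete, here's a throwaway sketch (plain C99, nothing implementation-specific, and nothing taken from the standard's own examples): the only observable behavior below is the final printf, so a conforming compiler is free to compute the sum any way it likes, drop the loop entirely, or pad it with pointless work, as long as the printed result comes out the same.

    #include <stdio.h>

    int main(void)
    {
        int sum = 0;
        for (int i = 0; i < 100; i++)   /* no volatile accesses, no I/O inside */
            sum += i;

        /* The only observable behavior is this output.  The compiler may
         * fold the loop into the constant 4950, keep it as written, or
         * surround it with useless computations -- all of these are
         * equally conforming translations. */
        printf("%d\n", sum);
        return 0;
    }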
Let me elaborate a bit on what I'm trying to say.
Often with good intentions, and in response to the demand for writing good, correct programs, people learn to be exaggeratedly picky and overconcerned about standards. While this is in general much better than not paying attention to the documentation at all, being ridiculously pedantic and stubborn about every single letter written in an ISO document isn't any good either.
Standard documents have the goal of being comprehensive, authoritative and remarkably precise references for a given subject, within the context for which they're devised. They must clearly give solutions to the problems they're meant to tackle, and disambiguate and answer any possible question that might arise. For that reason these documents are often extremely elaborate and carefully crafted, down to the linguistic level, to afford this degree of exactness.
However, it is this same exactness which suggests that people should be mathematically precise when dealing with these documents, when in reality there are also very practical goals which a standard tries to achieve. Sometimes these goals are not expressed in the document, mostly because they are rather ineffable and it would take too much verbosity to explain and enforce them. Yet these goals are fairly obvious to anyone with even marginal insight into the subject.
For example, the C standard does not specify how exactly the operations in the source code should be carried out at the machine level. It does not explicitly state that the compiler ought to generate, let's say, the most succinct or the most "natural" way of executing a given statement, simply because it is nearly impossible to define these things in the general case. This allows for the behavior I've exemplified: a compiler could insert an arbitrary number of NOPs before every sensible instruction it emits.
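As a sketch of what such a perverse-but-conforming translation could look like (the inline asm below is GCC/Clang extension syntax, used purely to depict the imagined machine code; it's nothing the standard itself knows or cares about):

    /* What you wrote: */
    int add(int a, int b)
    {
        return a + b;
    }

    /* What a hypothetical troll compiler could translate it as,
     * expressed back in C for illustration: */
    int add_trolled(int a, int b)
    {
        __asm__ volatile ("nop");   /* pointless, but changes no observable behavior */
        __asm__ volatile ("nop");
        __asm__ volatile ("nop");
        return a + b;               /* still the required result */
    }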
Now, the question is, should a compiler do that? Obviously not. While the C standard does not make such a compiler illegal, in practice this kind of behavior is simply nonsense. The committee did not write that explicitly, but they surely didn't have such a "design goal" in mind when they joined efforts to standardize the language. It's a practicality not liturgically written in the document, but nonetheless an issue everyone is well aware of.
In other words, some sort of good taste is necessary when interpreting standard documents. They're not mathematically precise, they never will be, and they were not designed with that purpose in mind in the first place. The text should never be over-interpreted, tested against every possible piece of dreamy interpretation one could think of.
Can a compiler produce random output based on some sort of generator or astrological prediction? Yes, it could. Should a compiler do that? Most likely not. Would anyone use such a tool in the general case? Well, I surely would not. These insanities are simply a disservice to their users. Compilers ought to be as predictable as possible: given the same input, they should not vary their output unless there's a good reason for it. Compilers should not impact their programs' performance arbitrarily, either. One's output is another one's input: a compiler which does not offer good, consistent behavior in its output will turn the lives of linkers, interpreters, loaders and kernels into an incompatibility hell.
Some people object with things like, "Well, the standard does not define the execution character set of programs, how can one possibly write a compliant 'Hello World' program? What if the compiler generated ROT13 representations of the EBCDIC character set as output?" That's just stupid.
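For concreteness, the program in question is the most basic one imaginable; nothing here is implementation-specific, yet the standard still leaves the execution character set up to the implementation:

    #include <stdio.h>

    int main(void)
    {
        /* The standard does not pin down the execution character set, so
         * the actual byte values emitted for this string literal are up
         * to the implementation -- which is exactly the loophole the
         * objection above is (mis)using. */
        printf("Hello, World!\n");
        return 0;
    }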
No one would ever design such a tool, at least not expecting it to be employed for anything useful at all. Should you, then, care about the strings in your output being "accidentally" replaced with ROT13-encoded EBCDIC by your enemy compiler? Of course not. Your code is still fully compliant as far as any standard is concerned.
So, don't be insanely obsessed with what standards allow and do not allow. When writing compliant code one should always try to stay within what is explicitly allowed, but there's no point in being completely crippled by what is explicitly and strictly allowed. For example, don't worry about whether your high-performance code will be filled with NOPs because, well, you know, the current date/time of your machine happens to be a multiple of 7 and your compiler could decide that's a good reason for trolling you.
Note that when one is writing code with performance in mind, one is implicitly assuming the compiler won't arbitrarily mess with one's work, and this assumption could drive the idea that you're actually writing code which depends on compiler behavior!
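A throwaway sketch of what I mean (a hypothetical function, nothing special about it): nothing in the standard promises that this loop compiles to tight, vectorized code, yet anyone writing it "for performance" is quietly relying on the compiler behaving sensibly.

    #include <stddef.h>

    /* Hypothetical hot loop: the standard only guarantees the result,
     * not that the compiler vectorizes it, keeps it in registers, or
     * refrains from padding it with garbage.  Writing this "for speed"
     * already assumes a sane, non-trolling compiler. */
    void scale(float *dst, const float *src, size_t n, float k)
    {
        for (size_t i = 0; i < n; i++)
            dst[i] = src[i] * k;
    }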