>>26
What an idiot.
> Perhaps the best simple example to illustrate is its invention of the format function printf. Completely ad hoc,
So what?
> inflexible,
Wrong.
> cryptic syntax and semantics.
You never read the fucking manpage, did you?
> and can't do arbitrary n-based number systems
Because 99% of the time you print out a number, it's going to be in decimal. The other 1%, in hex or octal.
> It can print decimal in several formats, but in a bunch of ad hoc fixed ways.
Same as above. No one cares about printing decimal numbers with their digits backwards or some other obscure shit.
> many programmers don't really understand what an n-based number system is,
Nor does it matter.
> And if you show them the hex number system using decimal digits in a list, they would be wildly flabbergasted and go "WHY would you ever want to do that??"
Then how about you explain why, genius?
> Instead of working on a better compiler, let's invent a short syntax on the spot!
How does a better compiler save keystrokes?
> to this day, there are programmers who don't understand the difference between a set of true/false formal parameters and their internal representation, and insist that a bitmask is the most efficient way to "encode" boolean parameters.
This sounds like a guy who was too stupid to understand bitmasks properly. Something a 9-year-old kid could probably understand with a bit of teaching.
What's with these idiots who think computers are some magic thing?