>>20
Code does not have to make assumptions about the limits of integer types; it can consult the constants from <limits.h>.
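Something like this, say (count_t is just a name made up for the sketch):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* The standard only guarantees minimums (int >= 16 bits,
           long >= 32 bits); the actual limits live in <limits.h>. */
        printf("CHAR_BIT = %d\n", CHAR_BIT);
        printf("INT_MAX  = %d\n", INT_MAX);
        printf("LONG_MAX = %ld\n", LONG_MAX);

        /* Pick a type at preprocessing time from the guaranteed range. */
    #if INT_MAX >= 100000L
        typedef int count_t;   /* int is wide enough on this platform */
    #else
        typedef long count_t;  /* fall back to the >= 32-bit guarantee */
    #endif
        count_t total = 100000; /* would overflow a 16-bit int */
        printf("total = %ld\n", (long)total);
        return 0;
    }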
Rendering integers useless, in other words. If your program has to work with the smallest integer types just in case, because you can't make any assumptions about integer sizes, I guess you'd end up always using long and assuming int gives you no more than 16 bits.
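Which is pretty much what portable code did, to be fair: int only promises 16 bits, so anything that can pass 32767 goes in a long. A toy sketch:

    #include <stdio.h>

    /* Count bytes on stdin. int is only guaranteed 16 bits, so a
       count that can exceed 32767 has to live in a long (>= 32 bits). */
    int main(void)
    {
        long n = 0; /* long, so inputs over 32767 bytes don't overflow it */
        int c;      /* getchar() returns int, not char, to make room for EOF */
        while ((c = getchar()) != EOF)
            n++;
        printf("%ld bytes\n", n);
        return 0;
    }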
>>25
Toy, you say? Can't you see it? The second I learned older architectures had 9- or 18-bit bytes, I went like House's "WANT" picture. You know, the failure of Unicode 2.0 (bolting surrogate pairs onto UCS-2 to make UTF-16), the 21-bit Universal Character Set, the inefficient UTF-32, the early arrival of carry-chained 64-bit arithmetic on 32-bit architectures because 4 GB just isn't enough: so many hassles wouldn't have been necessary if we had 18-bit bytes. I luv 18-bit bytes. I'm going to fap tonight to an 18/36-bit architecture. In fact, if I were back in the '60s and in charge of designing the architecture of the future with everything I've learned since, I'd make it a 36-bit machine with 18-bit bytes. DO WANT.