>>20
It isn't necessarily wrong to say a file is x bytes either, as long as you know the number of bits in a byte on the architecture the file was created on.
No, on the architecture where some particular copy of it is currently stored. Because obviously it would have to be re-encoded if you move it to another architecture, which would change its size.
Measuring file size in bytes is just as wrong as measuring file size in sectors, from the autist's viewpoint. It's not a property of the file, but only of one particular incarnation of it. So it is WRONG to call this value "file size".
Then, when you create other copies this number might change, and to know how it would change you'll need to translate the first pseudo-size to the real size in bits, then translate back to the second pseudo-size. As opposed to using the real size all along, and translating it only once in those rare cases when you need to know the size in platform-dependent units.
So if you are a proper, self-respecting autist, you simply can't stop at decrying the use of the word "byte" to mean "octet" in the extremely narrow class of programming-related situations. It is INCONSISTENT to demand that everyone loop over CHAR_BIT bits but sheepishly measure file sizes in mebibytes.
It's like putting on lipstick, eye shadow, and glittering leggings, heading to a gay bar, but then refusing to suck any cocks. It completely destroys your credibility, it means that you are not a real gay or autist, but only pretend to be one because it's hip.