Under the standard ASCII set, the most significant bit of each character is 0. So why doesn't the EOF constant just use the highest possible value? Is it to provide support for extended character sets? Surely a single character value could be sacrificed in favor of being able to recognize the EOF signal without using an integer to store a character.
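For what it's worth, the usual answer is that C's stdio functions return an int precisely so that EOF can be a value outside the range of every valid byte (typically -1); reserving 0xFF for EOF instead would collide with extended character sets, where all 256 byte values are legitimate data. A minimal sketch of the trade-off (the file name is hypothetical):

#include <stdio.h>

int main(void)
{
    /* "data.bin" is a hypothetical file; imagine it contains the byte 0xFF,
       which is a perfectly valid character in Latin-1 ('y with diaeresis'). */
    FILE *f = fopen("data.bin", "rb");
    if (!f)
        return 1;

    /* The standard idiom: fgetc() returns an int so that EOF (a negative
       constant) can never collide with any of the 256 possible byte values. */
    int c;
    while ((c = fgetc(f)) != EOF)
        putchar(c);

    /* If EOF were instead defined as the "highest possible value" (0xFF),
       or if c were declared as a plain char, the 0xFF byte in the data would
       be indistinguishable from end-of-file and the loop would stop early. */

    fclose(f);
    return 0;
}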
Sure, it is full of trolls, idiots, and idiotic trolls, but within these dark lands, hidden beneath the sea of shit, dwell true wizards. And it is all worth it for those few, precious moments.
Commodore and Ubuntu targeted blacks -- products offering simplicity as a selling point.
Name: Anonymous 2011-09-28 10:12
>>3
LoseThos documents (source code) have ASCII text first, then a terminating zero, then binary data. There are library functions which conveniently load and save these documents. You just work with the structures in memory.
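To make that layout concrete, here is a rough sketch of what loading such a document could look like. The struct, the load_doc name, and the file name are my own invention for illustration, not the actual LoseThos library API:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct doc {
    char          *text;    /* ASCII portion, NUL-terminated */
    unsigned char *bin;     /* binary payload after the terminating zero */
    size_t         bin_len;
};

/* Hypothetical loader: reads the whole file into memory and splits it at
   the first terminating zero into a text half and a binary half. */
static int load_doc(const char *path, struct doc *d)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return -1;

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    rewind(f);
    if (size <= 0) {
        fclose(f);
        return -1;
    }

    unsigned char *buf = malloc((size_t)size);
    if (!buf || fread(buf, 1, (size_t)size, f) != (size_t)size) {
        free(buf);
        fclose(f);
        return -1;
    }
    fclose(f);

    unsigned char *zero = memchr(buf, 0, (size_t)size);
    if (!zero) {            /* no terminating zero: not a valid document */
        free(buf);
        return -1;
    }

    d->text    = (char *)buf;   /* freeing d->text releases the whole buffer */
    d->bin     = zero + 1;
    d->bin_len = (size_t)((buf + size) - d->bin);
    return 0;
}

int main(void)
{
    struct doc d;
    if (load_doc("program.hc", &d) == 0) {   /* hypothetical file name */
        printf("text: %s\n", d.text);
        printf("binary payload: %zu bytes\n", d.bin_len);
        free(d.text);
    }
    return 0;
}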
I held back from using all 256 values but recently went with extended screen codes. I gotta innovate or die. If it's not different and better, I lose.
>>8
Sounds awesome: an OS that dictates document formats. Microsoft should have thought of that and forced everyone to use MS Word documents for everything.