Name: Anonymous 2011-09-28 7:40
Under the standard ASCII set, the most significant bit of each character is 0. If so, why doesn't the EOF constant just use the highest possible value, leaving the rest free to provide support for extended character sets? Surely a single character value could be sacrificed in favor of being able to recognize the EOF signal without using an integer to store a character?