
ASCII and EOF

Name: Anonymous 2011-09-28 7:40

Under the standard ASCII set, the most significant bit of each character is 0. If so, why doesn't the EOF constant just use the highest possible byte value? To provide support for extended character sets? Surely a single character could be sacrificed in favor of being able to recognize the EOF signal without having to use an int to store a character?

Name: Anonymous 2011-09-28 7:54

This is /prog/. No one will take you seriously.

LISP sucks
Haskell sucks

Java forever

