
Lexical Analysis: Hard Parts

Name: Anonymous 2013-07-15 16:58

How does one do lexical analysis (tokenization) of Bash/Perl-style strings, where interpolation in the middle is possible?
Say we have
print "Now is {get "time of day"} of {get "current date"}..."
print 'have a nice day!'

And {…} inserts a value into the middle of the string.

How would the lexer know which double quote closes the string and which is part of the string?

Python uses format()-like routines instead, because parsing such strings is hard.
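One common way to handle this (a sketch, not something from the thread; token names are made up): the lexer keeps a mode stack instead of a single state. A `{` inside a double-quoted string pushes "code" mode, and the closing `}` pops back into the string, so a quote seen inside `{…}` opens a *nested* string rather than closing the outer one.

```python
# Sketch: tokenizing strings with {...} interpolation via a mode stack.
# Token kinds ("code", "str", "interp_start", "interp_end") are hypothetical.

def tokenize(src):
    tokens = []
    modes = ["code"]   # top of stack: "code", or the quote char we're inside
    buf = ""

    def flush(kind):
        nonlocal buf
        if buf:
            tokens.append((kind, buf))
            buf = ""

    for c in src:
        mode = modes[-1]
        if mode == "code":
            if c in "\"'":
                flush("code")
                modes.append(c)        # enter a string delimited by this quote
            elif c == "}" and len(modes) > 1:
                flush("code")
                tokens.append(("interp_end", "}"))
                modes.pop()            # resume the enclosing string
            else:
                buf += c
        else:                          # inside a string delimited by `mode`
            if c == mode:
                flush("str")
                modes.pop()            # this quote closes the string
            elif c == "{" and mode == '"':   # single quotes stay literal
                flush("str")
                tokens.append(("interp_start", "{"))
                modes.append("code")   # code until the matching }
            else:
                buf += c
    flush("code")
    return tokens
```

On the thread's example, the quote after `{get ` pushes a new string mode, so `"time of day"` tokenizes as a nested string and the outer string resumes after `}`. This sketch ignores escapes and unterminated input, but it's the core reason the answer to "which quote closes the string" is "the lexer's current mode knows".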

Name: Anonymous 2013-07-15 20:11

HMMM LET'S SEE WE CREATE A STATE MACHINE THAT GOES THROUGH EACH CHARACTER IN THE FUQIN STRING AND WHEN WE ENCOUNTER A FUQIN QUOTE, WE INCREMENT A VARIABLE CALLED QUOTE_COUNT OR SOME SHIT LIKE THAT.  AND...

WAIT, I DON'T FUCKING KNOW WHAT I'M DOING
FUCK.
