[pro] Is cl-yacc going to cut it?
Pascal J. Bourguignon
pjb at informatimago.com
Fri Feb 4 16:06:04 UTC 2011
Paul Tarvydas <paul.tarvydas at rogers.com> writes:
>> symbol parse, but is there another general parsing technique, available
>> as a lisp library of course, that either works at a lower level than
>> yacc usually does or allows the lexer to access more context about the
>> parse?
>
> The relatively new PEG packrat parser technologies make it possible to
> use just one universal description for both scanning and parsing. I
> see that cl-peg exists, but I haven't tried it out.
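(To make the quoted point concrete, here is a toy scannerless sketch in
plain CL. The combinators are hand-rolled, not cl-peg's actual API, and
there is no memoization, so it is not a real packrat parser; it only
shows that one description can cover characters and structure alike.)

;; Each parser takes a string and a position and returns the new
;; position on success, or NIL on failure.
(defun p-char (pred)
  (lambda (s i)
    (when (and (< i (length s)) (funcall pred (char s i)))
      (1+ i))))

(defun p-seq (&rest ps)
  (lambda (s i)
    (dolist (p ps i)
      (setf i (funcall p s i))
      (unless i (return nil)))))

(defun p-many (p)                       ; zero or more repetitions
  (lambda (s i)
    (loop for j = (funcall p s i)
          while j do (setf i j))
    i))

;; One rule covers what a separate lexer would normally do:
;;   identifier <- letter (letter / digit)*
(defparameter *identifier*
  (p-seq (p-char #'alpha-char-p)
         (p-many (p-char #'alphanumericp))))

;; (funcall *identifier* "foo42 bar" 0) => 5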
Well, if we may distract the OP from cl-yacc, I'll note that Zebu also
contains an (unoptimized) lexer (it just uses regexps naively to match
tokens).
I would also note that, since context-free languages include regular
languages, there is little theoretical point in distinguishing a lexer
from a parser: you can describe the tokens using ordinary grammar rules.
space-or-comment := { space | comment } .
comment := '#' '|' { comment-chars } '|' '#' .
comment-chars := '|' not-sharp | not-pipe .
not-sharp := space | letter | digit | '|' | '+' | '-' | ... .
not-pipe := space | letter | digit | '#' | '+' | '-' | ... .
identifier := space-or-comment letter { digit | letter } .
and so on; basically the only lexer you need is READ-CHAR.
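For instance, here is a rough sketch of the identifier rule written
directly against the character stream. The names below are mine, not
from any library, and comment skipping is reduced to whitespace to keep
it short:

(defun skip-space-or-comment (stream)
  ;; space-or-comment := { space | comment }   (comments omitted here)
  (loop for c = (peek-char nil stream nil nil)
        while (and c (member c '(#\Space #\Tab #\Newline)))
        do (read-char stream)))

(defun read-identifier (stream)
  ;; identifier := space-or-comment letter { digit | letter }
  (skip-space-or-comment stream)
  (let ((c (peek-char nil stream nil nil)))
    (unless (and c (alpha-char-p c))
      (error "Expected an identifier."))
    (coerce (loop for c = (peek-char nil stream nil nil)
                  while (and c (alphanumericp c))
                  collect (read-char stream))
            'string)))

;; (with-input-from-string (s "  foo42 := 1") (read-identifier s))
;; => "foo42"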
--
__Pascal Bourguignon__ http://www.informatimago.com/
A bad day in () is better than a good day in {}.