Related articles
Input buffer overflow in lex przemek@viewlogic.com (1993-01-04)
Re: Input buffer overflow in lex... johnl@iecc.cambridge.ma.us (John R. Levine) (1993-01-05)
Re: Input buffer overflow in lex... vern@daffy.ee.lbl.gov (1993-01-05)
Re: Input buffer overflow in lex... richw@sol.camb.inmet.com (1993-01-06)
Re: Input buffer overflow in lex... finger@convex.com (1993-01-08)
Newsgroups: comp.compilers
From: finger@convex.com (Jay Finger)
Organization: CONVEX Computer Corporation, Richardson, Tx., USA
Date: Fri, 8 Jan 1993 00:12:02 GMT
References: 93-01-009 93-01-024
Keywords: lex, comment
richw@sol.camb.inmet.com (Richard Wagner) writes:
>In the version of "lex" I use, the generated lexer gets its input via an
>"input" macro. One solution, which may be viewed as a "kludge" or "hack",
>is to [redefine the input() macro]
I don't remember where I read it, but some piece of AT&T documentation I
once came across described the input macro and suggested that you redefine
it to fit your own application if you thought that was appropriate. I have
done this many times in the past, but all of the code I have with me now
uses flex (which uses a completely different input mechanism).
So, while it may be a kludge, it's at least an "official kludge". :-)
jay
--
Jay Finger - finger@convex.com | Convex Computer Corp
CONVEX System Integration and Support | 3000 Waterview Parkway
Product Specialist | Richardson, TX 75080
[In the O'Reilly "lex & yacc" there are several examples of redefining the
lex input macro, since there isn't any other way to change the input source.
Flex lets you redefine YY_INPUT, which reads a chunk of input at a time;
that's a lot easier, since you don't have to worry about pushback as you do
with input(). -John]
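For illustration, a minimal sketch of the flex route using its documented
YY_INPUT(buf, result, max_size) hook; source_buf and source_pos are made-up
names for this example:

    %{
    /* Sketch only: point a flex scanner at an in-memory string. */
    static const char *source_buf;   /* text to scan       */
    static int source_pos;           /* current read offset */

    #define YY_INPUT(buf, result, max_size)                  \
        do {                                                 \
            int c = source_buf[source_pos];                  \
            if (c == '\0')                                   \
                result = YY_NULL;    /* signal end of input */ \
            else {                                           \
                buf[0] = c;                                  \
                source_pos++;                                \
                result = 1;                                  \
            }                                                \
        } while (0)
    %}

Since YY_INPUT only has to hand flex a block of bytes and a count, there is
no pushback bookkeeping at all, which is the point John makes above.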